"The Data Diva" Talks Privacy Podcast

The Data Diva E34 - David Kruger and Debbie Reynolds

June 29, 2021 Debbie Reynolds Season 1 Episode 34
"The Data Diva" Talks Privacy Podcast
The Data Diva E34 - David Kruger and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, "The Data Diva,” talks to  David Kruger VP, Strategy, Co-Founder of Absio Corporation co-creator of Software-defined Distributed Key Cryptography.  We discuss David’s technology background, the problem of data control and data leakage,  the history of metadata and true data insights, changing the ways that we think about data protection,  Managing 3rd party data risk using software,  protecting data using location restrictions, the impact of the invalidation of The EU-US and Swiss Privacy Shield on data movements and security, The ability to store cryptography keys anywhere in the world, what is not being discussed and should be about data privacy and data protection, data control and data as a property rights,  redress for breaches is difficult, creeping data seepage, need for technology that engenders trust and his wish for data privacy in the future. 


 


SUMMARY KEYWORDS

data, people, information, software, breached, control, metadata, decrypt, privacy, transfer, company, problem, physical, third party, cybersecurity, keys, technology, invalidation, harm, world

SPEAKERS

David Kruger, Debbie Reynolds

 

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva." This is "The Data Diva" Talks Privacy podcast, where we discuss privacy issues with industry leaders around the world, with information businesses need to know right now. So today, I'm very happy to have a special guest on my show. His name is David Kruger; he's the VP of Strategy, co-inventor/co-founder of Absio Corporation. And he's also on the Forbes Technology Council. Hello. Hello, how are you doing?

 

David Kruger  00:49

I'm doing well; I'm doing well.

 

Debbie Reynolds  00:52

First of all, thank you for being on the show. I really appreciate you appearing on short notice. You and I have been connected on LinkedIn; we comment on posts and different things. And you called me up, we had a meeting, and the meeting we had was mind-blowing. And I said I have to have you on the show because some of the things that you talked about are just stunning. And I think it'd be great to share with the audience your experience and what you're doing with Absio. So, to get started, why don't you tell me about your journey in technology? I think it's fascinating.

 

David Kruger  01:35

Well, my background is not actually in IT. I didn't get into the IT world full time, I guess you would say, until about 1999. Prior to that, I had been in process and transportation safety. But that had a software development aspect to it because I wrote software; I ran teams of developers that made software that we needed to do process safety for big chemical companies, refineries, and things like that. We were in charge of keeping things from blowing up, right? So there's a lot of engineering work that goes into that. So I've always been dabbling in software. But in 1999, I got into it full time with my twin brother; he had asked me to come and help him do an early Internet startup, which is actually still around. It started off as Construction News, and now it's a company that delivers construction plans and things like that to general commercial contractors over the Internet. We had started up another couple of companies and sold them in 2008. We wanted to work on the cybersecurity issue because we saw it as an issue that was not going to go away, and that company eventually became Absio Corporation, where I work now.

 

Debbie Reynolds  03:06

Excellent, excellent. Let's see. So I have so many questions I want to ask you. I guess let's talk about the problem of data sharing and data leaking. The problem, as I see it, is that as people's data is used for all different types of purposes, there's data leakage, or there are data security issues with how that data is either being maintained or how it's being transferred around. And this, to me, is the problem du jour: how do we best protect the data of individuals that we hold? And I feel like, in the conversation we had, you had some great answers to that. So why don't you tell me a little bit about your thoughts on this whole thing about protecting people's data?

 

David Kruger  04:06

Well, let me start back with the safety issue, because this actually is safety. Safety thinking, or safety engineering thinking, is actually how we came up with the technology. And so we've come to understand that data can be hazardous if it either gets into the wrong hands or gets used in a way that whoever is the steward of that data doesn't intend. In privacy, it's your personal information; for a company, it's their corporate information. We started with the concept that you have to separate two components, right? Digitized information is a physical thing that has the potential to be hazardous, and when I say physical, people kind of collapse information and digitized information into the same thing, but they're not the same. When you digitize information, you're turning it into a pattern of ones and zeros that are expressed either in magnetic particles or in some kind of electromagnetic radiation, light or radio. Those things are the medium, the physical medium, for that information; the information is separate. So if you send me an email, the information is what the words mean in my head. The digitized information is the physical way that they're stored and transported. And the way you control a thing that's a hazard is you control its physicality, its physical nature. So what we realized when we started thinking about the problem this way is that you can go all the way back to the early 50s at Bell Labs, where the first human-usable information was digitized. It was put on tape; they could come back to that tape, they could find a file, they could open it up, alter it, reuse it, save it again, and so forth. And that's all the controls they had on the data. The only controls that were on the data, on the physical stuff, right, were metadata that said, hey, this is my name, and this is where I'm located, so you can come back and find it. The problem with that is that when software makes data that way, it's completely uncontrollable once it's shared. Right. So if you send me an email, and let's say it's got an attachment on it, and it says, hey, David, don't share this with anybody else, this is private information, do you actually have any way to enforce that? Right, you're having to trust me that I'm not going to do something with your data that you don't want. So very simply, in a nutshell, what we did is we figured out how to cryptographically bind additional controls to the physical data objects.
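To make that last idea concrete, here is a minimal sketch, not Absio's actual implementation, of one way usage rules can be cryptographically bound to a data object: with authenticated encryption (AES-GCM), the policy travels with the ciphertext as "associated data," so a recipient who tampers with the rules can no longer decrypt. The policy fields are hypothetical.

```python
# A minimal sketch (not Absio's implementation) of binding usage rules to a
# data object with AES-GCM. The policy is "associated data": not secret, but
# any tampering with it makes decryption fail with InvalidTag.
import json
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def seal(plaintext: bytes, policy: dict, key: bytes) -> dict:
    """Encrypt the payload and cryptographically bind the policy to it."""
    nonce = os.urandom(12)  # 96-bit nonce, unique per encryption
    aad = json.dumps(policy, sort_keys=True).encode()
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, aad)
    return {"nonce": nonce, "ciphertext": ciphertext, "policy": policy}

def unseal(obj: dict, key: bytes) -> bytes:
    """Decrypt only if the attached policy is intact."""
    aad = json.dumps(obj["policy"], sort_keys=True).encode()
    return AESGCM(key).decrypt(obj["nonce"], obj["ciphertext"], aad)

key = AESGCM.generate_key(bit_length=256)
obj = seal(b"attachment contents", {"share": False, "purpose": "review"}, key)
obj["policy"]["share"] = True  # a recipient tries to loosen the rules...
# unseal(obj, key)             # ...and decryption now raises InvalidTag
```

The rules themselves aren't hidden in this scheme; what the binding buys you is that nobody can strip or alter them and still get the data to decrypt.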

 

Debbie Reynolds  07:01

Oh, wow. That's amazing. Yeah, metadata. Metadata is kind of the reason we have jobs, I guess. Yes. Metadata. So interesting. For people who don't understand what metadata is, people call it data about data. It is that tangential data that is stored with the information and that may not be readily apparent on the face of the information. This becomes an issue also with images. When you're taking photographs and things like that, the camera is collecting metadata about that photograph, like where it was taken, maybe even the camera it was taken on, different things like that. So those things are very telling. And people who can extract that information can glean a lot of insights from it.
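As a quick illustration of how much a photograph can carry, here is a short sketch using the Pillow library to dump the EXIF tags embedded in an image; the file name is hypothetical.

```python
# Sketch: reading the "data about data" embedded in a photo.
# Requires the Pillow library; "vacation.jpg" is a hypothetical file.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("vacation.jpg")
for tag_id, value in img.getexif().items():
    name = TAGS.get(tag_id, tag_id)  # translate numeric tag IDs to names
    print(f"{name}: {value}")        # e.g. Make, Model, DateTime, GPSInfo
```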

 

David Kruger  07:58

So they can see that information. That information is basically available to anybody who gets that data object, right? If they've got it, they normally have the information, and they have the metadata about the information. Right? So that's it. I mean, if you want to look at what is the root cause of these privacy and cybersecurity failures that we're currently dealing with, if you take it down to its root causes, which is the thing safety people do, right? What's the root cause problem? It's that software makes data that's uncontrollable once it's shared. Think of a cybersecurity problem: as long as you can get possession of that data object, as long as you can go in on somebody else's machine and make a copy of it and exfiltrate it to your machine, you can open it up and do whatever you want with it. You can alter it; you can send it to anybody in the world; those people can alter it or send it to anybody in the world. That's what they call a data breach: the copies of the data get out, and they're uncontrollable by the person who originally made the data. Same thing for privacy, but not necessarily in the context of a cybersecurity data breach. A lot of our privacy issues are around data being used in ways that we didn't intend for it to be used. Right?

 

Debbie Reynolds  09:24

Right.

 

David Kruger  09:24

So both of those are a manifestation of the inability to control the use of data when it's on somebody else's device. Does that make sense?

 

Debbie Reynolds  09:36

Oh, absolutely. Absolutely. The analogy I always use about cybersecurity is that people think of it like a castle, right? So there's a castle that is going to defend the perimeter in the hope that no one gets in, but that doesn't take into account that maybe the danger is already inside the gates. And then, once people are inside the gates, there are things they can do with data or information that, like I said, is already uncontrollable. So trying to find ways to protect the data in a way that makes it useless for someone who isn't authorized to use it, I think, is really important. So tell me a little bit about what you do that solves that problem.

 

David Kruger  10:33

Okay, so our first customer, and this is going to go exactly to the point you were just making, was US Army Intelligence. I'll go back a little bit more in history: we had sold this other software company, and we did a couple of years of stealth engineering to see if this whole concept of making data controllable was even possible. And our first customer was US Army Intelligence, where they wanted us to build a new secure tactical battlefield communication system. The use case was to assume that the system is already compromised: bad guys are already in our network, they've already got stuff on our devices, our users have already been socially engineered, their credentials are compromised. Assume that is all already true, and now keep that data secure. And because we had coalition partners back at the time, we had people handling the data along with the US military that we couldn't exactly do a great job of vetting, right, making sure that they were good guys, but we had to share the data with them anyway. We had to have a way to actually change people's access to data and revoke their credentials, even if that data was on their devices, not ours, or on their network, not ours. Right. So that was the basic use case. And that's inside out from defense-in-depth. The thing you're talking about, the castle, is official US government cybersecurity policy, and it's called defense-in-depth, right? You treat it just like a castle with layers of defense, and you hope you keep the bad guys from getting to the defenseless data. But we approached the problem from the other end. Why should the data be defenseless? Why shouldn't the data essentially be able to defend itself? Why shouldn't the data tell the software: this is how you can use me, where you can use me, on what devices you can use me, how long you can use me, and you can only use me for these purposes? Why shouldn't the data dictate that? That was the completely opposite approach from defense-in-depth. Now, that doesn't mean defense-in-depth isn't needed. It just means that it's not sufficient by itself to protect our data. And the news basically proves that point every day.

 

Debbie Reynolds  13:12

Right. So now there is a big issue around the world, as it relates to data privacy regulations, about third-party data transfer. We're seeing big companies like Apple and Google trying to shift some of their third-party data risk to the third parties themselves, where they're limiting the data they're giving to third parties, especially people who are doing marketing. And then they're asking those third parties to basically have a first-party consensual relationship with the customer. So a third party that would not have otherwise had to ask the customer for consent in the past would have to do it now. And I think a lot of that is because a big feature of many of the data regulations around the world says that the first-party data holder, or the data controller, has a responsibility for what happens to data that they give to third parties. And third parties also have some skin in the game in that regard. So this third-party data transfer issue is not going to go away, because almost every company, I feel, has to have some third-party relationship, right? There may be something you can't do in your company that you need a third party to help you with. And then you may get tangled in these data privacy regulations. So tell me about how your tool, or the process that you go through, would help a company that is trying to shore up its third-party data transfer mechanisms.

 

David Kruger  14:58

Well, okay, a couple of things, a couple of use cases here. First of all, whether it's GDPR or CCPA, the first-party company has a duty of care to make sure that that data is kept private, that it's not breached. Right. We accomplish that simply by enabling software to encrypt the data from the moment it's created, and it's never decrypted again except momentarily, when it has to be used. Unless you're doing a thing called homomorphic encryption, you have to decrypt the data in order to be able to process it. So then, every time a use is done, it's re-encrypted with a new key. That's just a native part of the Absio software tools. The liability comes in when you have to take that data and share it with somebody else. Right. And at that point in time, if you've got to transfer that data to somebody, in order to defray your own liability, you need to be able to prove that when you transported it to them, it was encrypted, and that it was decrypted on their side, if they're going to do that. So contractually, if that data gets breached, and you don't have a track record showing that it was always encrypted while it was with us, that we handed it off, and that they decrypted it on their side, which our tools enable, then there can be this kind of finger-pointing thing: we had this breach, so where did the data actually come from? That's really hard to tell, by the way. Because the way data works is that it's transported by making a copy. Right? So you've got an identical copy at the first party and an identical copy at the third party, and there's really no way to tell where the data was breached from, not with any accuracy. This is one way a company could say: we handed this data off to you, you decrypted it on your side, and on our side, it was always secure. The other way, which I think is probably going to become more common, is that people who have to hand off data to other people are going to start to figure out, how can we work on this data jointly? How can we keep this data in a place where we can both secure it and we can both control it? Right? So I think in the long term, that's probably a better solution for privacy, because then the data is never in a state, whether it's in your hands or my hands, so to speak, where it's not secure and its use is not controlled. That's a long-term thing. But I think we're going to see more and more, especially the larger companies, saying, let's cooperate on this data. Let's not just throw it over a wall and hope nothing bad happens to it. Does that make sense?
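Here is a minimal sketch of the pattern David describes, encrypted from the moment of creation, decrypted only momentarily for use, then re-encrypted under a fresh key. It uses AES-GCM as a stand-in; it is not Absio's actual tooling.

```python
# Sketch of "encrypted from creation, re-keyed after every use."
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

class SealedRecord:
    def __init__(self, plaintext: bytes):
        self._rekey(plaintext)  # encrypted from the moment it's created

    def _rekey(self, plaintext: bytes) -> None:
        self._key = AESGCM.generate_key(bit_length=256)
        self._nonce = os.urandom(12)
        self._ct = AESGCM(self._key).encrypt(self._nonce, plaintext, None)

    def use(self, fn):
        """Decrypt momentarily, process, then re-encrypt under a new key."""
        plaintext = AESGCM(self._key).decrypt(self._nonce, self._ct, None)
        try:
            return fn(plaintext)
        finally:
            self._rekey(plaintext)  # old key and nonce are discarded

record = SealedRecord(b"customer row")
size = record.use(len)  # the data was in the clear only inside use()
```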

 

Debbie Reynolds  18:03

Right? So what you're doing is you're protecting the data, whether it stays in place or whether it travels someplace else, right?

 

David Kruger  18:16

Yeah, that's exactly right, because the controls are bound to the data. Right? The controls go wherever the data goes. So the security goes with it, and the controls go with it, because they become part of the data.

 

Debbie Reynolds  18:30

Yeah. So let's say I transfer data to a third party. I decide that maybe down the line, I no longer want to share that data. What do I do?

 

David Kruger  18:42

Well, if you've just transferred the data to them, you've just given them copies, and you don't have any further control of it. All you can do is hope that they do what they're supposed to do. You don't have any control over that, and you don't have any visibility into it. On the other hand, if your recipient company is using this type of technology for automated data security and the ability to control its use, then you can agree upon rules that go with your data when you transfer it to them. Right. So control is maintained. Think of it like this: if you have a contractual arrangement with somebody, the terms of that contract can become part of the data. You can use it for these things, these are the purposes we've agreed upon and put into a contract; you can use it for this long; you can use it in this way, but not this way. You can actually embody those rules in the data itself, so that the software that's using that data can read those rules and obey them. All of this becomes possible when you start to think of data as a physical thing that you can control.
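As a sketch of what "the terms of the contract become part of the data" might look like, here is a hypothetical rules check that compliant software could run before releasing a key. The policy schema is invented for illustration, not a real Absio format.

```python
# Sketch: software evaluating agreed-upon rules that travel with the data.
from datetime import datetime, timezone

policy = {  # hypothetical contract terms bound to the data object
    "allowed_purposes": ["billing", "support"],
    "expires": "2022-01-01T00:00:00+00:00",
    "allow_reshare": False,
}

def may_use(policy: dict, purpose: str) -> bool:
    """Return True only if this use matches the contract terms."""
    if purpose not in policy["allowed_purposes"]:
        return False
    expires = datetime.fromisoformat(policy["expires"])
    return datetime.now(timezone.utc) < expires

if may_use(policy, "marketing"):
    pass  # fetch the key and decrypt
else:
    print("use denied: not a purpose we agreed upon")
```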

 

Debbie Reynolds  19:59

Right. So you're telling me that, in that situation, I would be able to say down the line, okay, I want to stop sharing data with these people, and then I can do that?

 

David Kruger  20:11

Well, sure, you can do it. Because it's your data; you've given it to somebody else, and it's got your rules in it, right? We know where that data is. And then you can issue a command so that the next time the software that's warehousing or processing that data goes online, it receives the command that says: revoke the encryption keys. The data object is still in their possession, but the data is unusable because there's no key for it.
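A sketch of that revocation flow, using a hypothetical key service the consuming software must contact when it comes online. In a real deployment this would be an authenticated network service, not an in-memory object.

```python
# Sketch of remote key revocation: the data object stays on the recipient's
# disk, but without the key it is unusable. KeyService is hypothetical.
class KeyService:
    def __init__(self):
        self._keys = {}       # object_id -> key material
        self._revoked = set()

    def register(self, object_id: str, key: bytes) -> None:
        self._keys[object_id] = key

    def revoke(self, object_id: str) -> None:
        self._revoked.add(object_id)
        self._keys.pop(object_id, None)  # key material is destroyed

    def fetch_key(self, object_id: str) -> bytes:
        if object_id in self._revoked:
            raise PermissionError("key revoked; object can't be decrypted")
        return self._keys[object_id]

service = KeyService()
service.register("doc-42", b"\x00" * 32)
service.revoke("doc-42")       # the owner changes their mind later
# service.fetch_key("doc-42")  # next sync raises PermissionError
```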

 

Debbie Reynolds  20:47

Right. I think usability is a really key component, right? Because let's say someone breached an organization, and they breached data protected in this way. They wouldn't be able to have use of it, correct?

 

David Kruger  21:04

That's correct. I mean, it's not the physical data object that we're trying to protect. It's the information, right? If they get the object, but they can't get the information, do we care if they have the object?

 

Debbie Reynolds  21:23

Right. I love it. I love it. So, in recent years and months, well, actually, last year, there was the invalidation of the Privacy Shield framework, which was a data transfer framework between the US and the EU. And shortly after that, the Swiss-US Privacy Shield was also invalidated. Part of the reasoning behind the invalidation is that the US didn't have what they consider an adequate level of protection for the data of European and Swiss citizens that came over to the US. And so there's been this huge issue internationally, which is trying to get sorted out right now, about how, and when, and if these data transfers can start again. A lot of companies are relying very heavily on standard contractual clauses, which are just paper promises, right, about what to do with the data. And part of the discussion that I'm having with people, or people are having, is about how encryption can help. One of the biggest sticking points in this invalidation was that, because of our national laws here, data held by a US company, even if it's outside of the US, can be taken for law enforcement purposes. And so there are many people in Europe who are thinking that, first of all, if they don't have to transfer the data, they don't want to; if they have to transfer the data, they'll transfer the smallest amount necessary. And then we have a faction of people saying, let's do a transfer, but let me hold the keys to the encrypted data in the EU or in Switzerland. What are your thoughts about that?

 

David Kruger  23:42

Well, there's a ton here. So I'll back up and tell a little bit about what our underlying technology is and does. It's called Software-Defined Distributed Key Cryptography. It's a new type of cryptography; it uses standard FIPS-compliant cryptographic protocols, but the way that we manage the keys is a little bit different. In our architecture, the keys can be wherever they need to be. So if you need the data to be warehoused in the States and the keys to be held in France, that's entirely possible. The software developer can put the keys wherever they need to be put, so that can solve that use case. Other things are possible too, because you can put in any kind of control metadata and bind it to the data. Say a company in, let's pick a European country, said: we're going to let you use this data, but we're going to require that the data not travel outside of your corporate servers; only you can have access to this data. Well, the IP addresses of those corporate servers are known information, right? They're not a mystery. The corporation that's the recipient of the data knows what its server IP address ranges are, wherever that data is stored; it could be in their own facility, or it could be up in Azure or AWS. Those are all knowable facts. So you can actually say: we're going to transfer this data to you, we're even going to let you keep the encryption keys, but we're going to bind metadata to this data that says this information will only decrypt on your servers. So they can use it, they can transfer it, but it's going to be transferred encrypted, and it will only decrypt on their known physical or virtual servers. That way, you can transfer the data knowing that it's encrypted except when it has to be decrypted for use, and knowing that if it's breached, it's not going to decrypt for somebody who's trying to open that data up on a device that doesn't have the correct range of IP addresses. Does that make sense?
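A sketch of the "only decrypts on your servers" control: the data's bound metadata names the allowed IP ranges, and the software checks the local address before releasing a key. The ranges here are documentation examples, and a real system would pair this check with stronger attestation, since an address alone can be spoofed.

```python
# Sketch: an IP-range rule bound to the data, checked before decryption.
import ipaddress

ALLOWED_RANGES = [
    ipaddress.ip_network("203.0.113.0/24"),   # e.g. corporate data center
    ipaddress.ip_network("198.51.100.0/25"),  # e.g. the company's cloud VPC
]

def may_decrypt_here(local_ip: str) -> bool:
    """Release the key only on the recipient's known servers."""
    addr = ipaddress.ip_address(local_ip)
    return any(addr in net for net in ALLOWED_RANGES)

print(may_decrypt_here("203.0.113.17"))  # True: a known corporate server
print(may_decrypt_here("192.0.2.99"))    # False: a breached copy won't open
```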

 

Debbie Reynolds  26:22

Absolutely. So let's take it a bit further. This was stunning when we talked about it; I love your example. Give me an example of limiting the use of data to a geographical location.

 

David Kruger  26:41

Well, yeah, any way that you can describe geographical information, again, that's just metadata. There are a couple of different ways to do that; one of them is just GPS data. So before I decrypt, you have to send me a GPS feed that says, this is where I am. And again, that's a known universe; you can say, I'm only going to decrypt in the United States. You can get that information from GPS. And everything in the world has an IP address, and IP addresses are tied to a particular geography. Right. So you can say this is only going to decrypt in the United States. Let's say it's a company that's got trade secrets, or the information has military value; one of their base controls can be: this information is only going to decrypt in the continental United States. That's an enforceable control that you can bind to the data using the Absio technology. So you can control the geography. You can also control time. Devices have clocks, right? You can make a call to an external clock and say, you can have this information for the next two hours, and after that it's going to expire, and the software, using our technology and tools, will then delete the key and then delete the information, because you only needed it for the next couple of hours. But stop and think about that from a privacy perspective. How much information do we give to people that they actually only need for a short period of time?
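And a sketch of the time control: an expiry bound to the key, checked before each release, with the key deleted once the window closes. Illustrative only; as David notes, a real implementation would consult a trusted external clock rather than the local one.

```python
# Sketch: time-limited access. After the deadline the key is deleted, so the
# information, not just access to it, is gone.
import time

class ExpiringKey:
    def __init__(self, key: bytes, ttl_seconds: float):
        self._key = key
        self._deadline = time.time() + ttl_seconds  # local clock stands in
                                                    # for a trusted time feed
    def get(self) -> bytes:
        if time.time() >= self._deadline:
            self._key = None  # delete the key; the data is unrecoverable
            raise PermissionError("key expired")
        return self._key

k = ExpiringKey(b"\x01" * 32, ttl_seconds=2 * 60 * 60)  # "the next two hours"
key = k.get()  # works now; after two hours, PermissionError and no key left
```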

 

Debbie Reynolds  28:13

Right.

 

David Kruger  28:15

Right? What if we can put a clock on it that says, you need this for the next day, and you can have it for the next day, but after that, it's going to go away? Wow, that kind of takes the opportunity off the table for them to do something with your data that you don't want them to do.

 

Debbie Reynolds  28:31

Right. Yes, that's very Mission Impossible. I like it.

 

David Kruger  28:36

Yeah, the clock's ticking, and at the end, boom, it's gone. Mr. Phelps, you have five minutes to use this data.

 

Debbie Reynolds  28:44

You'll have to ask the marketing department about adding the smoke and the sound effects. Yeah. So, what is it that people, let me find the best way to put this, what is the question that people aren't asking that they should be asking right now?

 

David Kruger  29:04

I think there are sort of two big questions, right? The first is actually just education. People have to get their heads wrapped around the fact that data is actually controllable, that their information is actually something they can control, even when it's on somebody else's computer, laptop, tablet, or phone. They have to get their heads wrapped around the fact that it is possible. Once they get it in their heads that it's controllable, then the next question is, if they're using a piece of software, or a social media platform, or an application at work, or something they downloaded out of an app store, to begin to ask themselves: why am I using, or more importantly, why am I paying for, software that puts my data or my customers' data at risk when I don't have to? So what we're trying to foment is a little bit of a revolution, and maybe a little bit of a rebellion as well, that just says, hey, it's my data, it's my rules. Right? We're just trying to get people's heads wrapped around the fact that it's possible, and that they should really begin to think that way from a privacy and security perspective. Why am I using software that puts my data at risk?

 

Debbie Reynolds  30:40

Right, right. And then, too, I think this is where the future is going: people holding their own data, storing it like a bank for themselves, and then they get to choose what they share and what they don't share, and they can revoke consent if they need to.

 

David Kruger  31:01

Yeah, and at that point in time there's an interesting legal question, because we say "it's my data" right now, but it's really not, because you can't own something you can't control. On the other hand, if you can control it, you genuinely own it. And if your data has commercial value, and you can control it, then you can charge for it.

 

Debbie Reynolds  31:28

Interesting, yes. Yeah. So then, right, the idea of control goes almost to the idea of property?

 

David Kruger  31:38

Right? Oh, yeah. I mean, at that point in time, if you can control it, you've got this huge corpus of law around ownership of tangible and intangible property. The information that we put on a computer has both: the tangible portion is the physical portion that we control, and of course you have the information component, which is separate from that, right? So there's a ton of law around property usage and rights. "I'm not going to give you my data, but you can lease it" is a conversation that's only possible if you can control the use of your property, and you can evict people from your lease, right, if they aren't doing what they're supposed to. Right now, we've had this long struggle with trying to figure out what a harm standard is, and that's very, very difficult to qualify and quantify. We've been trying to develop that harm standard since the dawn of the Internet, and we haven't progressed as far as we'd like to. But if you move that over into the realm of: I lent you my information for a specific use, and now that you're done with it, I can take it back; or I lent it to you for a specific use, we have a contract for that, and you misused it; then that becomes a property matter, and we have a huge body of law around that. So it changes the commercial equation: my data, my rules, you want my data, pay me. And it changes how you can pursue redress against people who do things with your data that you did not want them to do, that you did not permit them to do.

 

Debbie Reynolds  33:31

Yeah, the thing I like about what you're talking about is that it allows you to prevent the harm, right, to prevent some of the harms to begin with. So then you don't even have to be so tied up in the contract, or in the redress part after the fact. Because we know that when individuals' data is breached, the harm can be catastrophic, right? And almost immediate. It's not something everyone can deal with; not everyone can pay a lawyer to go to court for years and years to try to fight cases about their identity and things like that. So I think this is a great way to try to minimize that harm upfront. What are your thoughts?

 

David Kruger  34:23

Well, I think so. I mean, if the root causes of the problems that we have with digital privacy and digital security are all generated by the fact that the data is uncontrollable, then if you can control it from the moment it is created, secure it, and keep it under control, there's a whole lot of problems we have now that just begin to go away. And frankly, I know this is sort of a self-serving statement, but when we look at the privacy and cybersecurity landscape, we can't figure out how to solve any of these problems without getting the data under control. Perimeter security, defense in depth, has its place, but it's not enough. If we don't get the data under control and have a way to exercise control, exercise ownership, and things like that, frankly, I don't know how we solve the problems.

 

Debbie Reynolds  35:30

Right. I agree with that. I agree with that. So, if it were the world according to David, right, and we did everything you say, what would be your wish for privacy, whether it be technology, law, anything? What is your wish?

 

David Kruger  35:54

We have this little phrase we use, and I'd like it to become real: my data, my rules. That's what I would like, for that to have real force and effect in the world, because I think it would solve a lot of big, big problems.

 

Debbie Reynolds  36:13

Yeah, that's great, that's wonderful. I love that. It's true. I think a lot of people feel like they don't have any control; they don't know where their data is or how it's being used. A lot of times, and I've coined the phrase "data purpose jacking" for this, the data is taken for one reason, and then, over time, whatever service you're using changes its features and functionality, and eventually they're changing their terms of service and how they're using the data. So they are really changing how your data is being used, just incrementally.

 

David Kruger  36:55

Well, yeah, the tragedy of that, Debbie, is that we don't want people to feel like they can't trust the people they're sharing their data with, because bad things happen. I'll give you one real quick example. I'm a cancer survivor. I want my medical data to be used, to have big data analytics run against it and things like that, because we have a familial history of the same kind of cancer, and I don't want other people to get it; I don't want my kids and grandkids to have to worry about it. I want my data to go into the research data for this particular kind of cancer. But I also want to know that that data can't, not won't, not because of a contract, but can't, get used for something other than cancer research, in this example. If you don't have ways to have enforceable control of your data, then people tend not to trust the people they're giving data to. They tend to hold data back. And that's not a good thing. We're all a lot better off if the use of our data is known and controllable, so that we can actually share stuff that has real benefit, that has real merit, and do it and not worry about it.

 

Debbie Reynolds  38:15

Yeah. I think when people have trust, they don't have a problem sharing, right? Because they know the data is being used for the purpose they say it is, and it's being done in a transparent fashion. So I think bringing technologies to the market that engender trust is critical.

 

David Kruger  38:37

Yeah, absolutely. Otherwise, if you don't trust the people you're sharing stuff with, you don't share stuff that should be shared to maximize the benefit, right?

 

Debbie Reynolds  38:45

Absolutely. Absolutely. Well, wow, this is amazing. I'm so glad that we were able to do this. You and I had a call, and I could have just hit record right there because we were talking about the same things; it was fascinating. So, thank you so much for being on the show. I really appreciate it. I appreciate the opportunity. Thank you, and we'll talk soon. Bye-bye.