"The Data Diva" Talks Privacy Podcast

The Data Diva E179 - Vikram Venkatasubramanian and Debbie Reynolds

April 09, 2024 Season 4 Episode 179

33:27

SUMMARY KEYWORDS

privacy, data, companies, cybersecurity, vikram, work, risk, people, home, fascinating, connected, devices, consumer, call, products, questionnaire, informs, group, apps, world

SPEAKERS

Debbie Reynolds, Vikram Venkatasubramanian


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Vikram Venkatasubramanian. He is the founder and CEO of Nandi Security. Welcome.


Vikram Venkatasubramanian  00:40

Thank you very much. Thanks for having me.


Debbie Reynolds  00:42

I can't introduce you like I don't know you; Vikram and I know each other very well. We got to know each other collaborating as part of a huge international group doing a project with IEEE. The project is named IEEE Cybersecurity for Next Generation Connectivity Systems, and that effort is split into multiple groups. The group I chair is Human Centricity for Data Control and Flow in Context of a Person. I got the opportunity to choose a Co-Chair, and Vikram was the first person I thought of.


Vikram Venkatasubramanian  01:24

Thank you for bringing me into that fold. It has been an absolute honor and a mind-blowing experience so far working together.


Debbie Reynolds  01:32

Yeah, you not only have deep knowledge in cybersecurity, you care about people. I think that really helps, and then you have so much drive and energy, so you have really helped me so much. I'm really proud and thrilled about the work that we're doing. I believe people are very interested in the work that we're doing, and they see all the good things that are happening. I want to do a shout-out also to Vikas Malhotra; he's also been a guest on the podcast, and he's really been the driver behind this effort. It's really amazing. We'll talk a little bit more about that project later because we definitely want to get more people involved. But before we go into that, please, please, please introduce yourself. Tell people about what you're doing, your trajectory in your career, and why this issue of human centricity is so important to you.


Vikram Venkatasubramanian  02:31

Absolutely. Just by way of introduction, I have been a technologist for 25-plus years. I actually started my career in the healthcare IT space but then moved to the telecom space to leverage the whole dot-com boom back in 1998 to 2000, and joined a high-flying startup in Silicon Valley. We went IPO with the 38th largest IPO on NASDAQ at that point; I was supposed to be retired by now. But what happened was the CEO committed accounting fraud, and the whole company went bankrupt. We ended up getting bought for a bag of peanuts by a very big telecom company. I spent almost a decade in the telecom space, and I realized at that point that my career had a gradual corporate progression and I wanted to get out and start something of my own. But I didn't have any business chops, as I was more on the technical side, so I quit my job, went back to school full-time for my MBA, and finished it in one year. I came out and started a company, again in the telecom space. Unfortunately, the year was 2008 to 2009, the worst ever time to start any kind of company; no funding was available. So we worked on it for about a year and a half, as long as our credit cards would allow, and then we shut it down. But serendipity took effect. During that process, we got connected to some local cybersecurity companies out here in the Boston area. One of our contacts there said, hey, you seem to have some excellent ideas; why don't you come join us now that you're shutting down your company? So I ended up in the cybersecurity space as a result, and I've been there all along, mostly on the enterprise side. I've worked at pretty much all of the big enterprise cybersecurity companies, bringing products to market, bringing partnerships to market, building go-to-market plans, executing in the field, etc.
I got to mingle with some of the top cybersecurity brains in the world, and the dinner conversations were almost always around what they were doing to protect themselves in their homes. I was very intrigued when I learned what they were all doing, the extent to which they were taking cybersecurity in their homes seriously, even as far back as seven or eight years ago. I started to look into it more myself and to gain a lot of the technical skills around it, and when I investigated it in depth, essentially running Wireshark in my own home and looking at what was happening, it opened my eyes. From that point on, I saw that there is a cybersecurity problem on the consumer side that is very unsolved, and people are really exposed. Now, we are in an era where the smart home is booming. And it's not just the smart home; the future is not just going to be devices in your home that are connected, but devices that are wearable and connected, devices that are implanted and connected, devices that are ambient and connected, that you're going to be interacting with all day long. So, in this world, how do you address the problems? That's essentially the origin story of how I got into consumer cybersecurity, specifically around privacy, because very quickly I came to the realization that you can't solve for security in the home without solving for privacy.


Debbie Reynolds  05:38

That's a perfect, perfect roadway for what you're doing and what we're working on together. Let's talk about the connected home.


Vikram Venkatasubramanian  05:47

Yep.


Debbie Reynolds  05:47

So, one of the reasons why I decided to focus a lot of my attention on IoT and connected systems is that I did see a lot of the privacy issues that come up. I feel like a lot of the people selling these products are selling the convenience, the cool factor, and all that stuff. I think that's great; nothing wrong with that. But these technologies do come with downsides, and I feel like the risk is taken on by the consumer, where the consumer is not educated enough to know what they need to be doing and what they need to look for. So it becomes their problem. If something bad happens, you can't go back to the manufacturer and say, hey, you did this or did that. So tell me a little bit about this problem.


Vikram Venkatasubramanian  06:40

Exactly spot on, Debbie. By the way, you're underselling yourself by saying that you're just looking into it; you're leading the world in thought leadership on this. That being said, yes, you're spot on, right? This is not about denying people the convenience of smart devices and apps. I love these things, I use them, and they make my life better. It is about enabling informed choice. Would you let a stranger into your home and give them unlimited access? No, you wouldn't. Why? Because you know there are intrinsic dangers to it. This is about bringing that same level of attention for the consumer around bringing in these new devices, apps, etc. The way I look at it, the attack surface of the home is not just devices. It comprises four things: devices, apps, the online services that you use, and the people in the home. All four of these need to be addressed in order to address the smart home problem. I think the biggest thing people don't realize is that there's a reason why a lot of these devices are sometimes free or very cheap. The reason the companies making and providing these devices, apps, or online services are able to survive is that they are monetizing something. If it's free, you are the product; we've all heard that. We've also heard that data is the new oil. There are some arguments for and against that, but either way, the fact that your data is extremely valuable, in that it informs not just how people use their products but also enables the company to monetize it as a commodity in itself, as an offering in itself, says a lot. If people are suspicious of that thought, let's just do a simple exercise: go look up the valuations of all the publicly traded companies on the NASDAQ and Dow that sell your data, add them up, and see how big an industry it is. It's well over hundreds of billions of dollars in the US alone. Why are people spending so much money on the data? Because it informs a lot of things.
I had a friend who worked for a very major online retailer as the head of product. One of the things he said struck me very strongly: he said, if I know four things about you, there is a very high, 90-plus percent probability that I can predict the fifth thing about you. Even if you don't know it, I know it, just based on the data they've been able to collect on past behaviors and how they're able to predict future behaviors. It informs a lot of these systems, online retail, apps, whatever it is. That's exactly what I want people to be aware of. My intent is that, hopefully, not just my company but an ecosystem of players will all work together to help build what I call cyber hygiene muscle memory in the family, so that when there's something new in the house, everybody knows it's treated as a security incident and not just as a new toy in the house. That's the mindset we need to bring in. And I take hope in this: if you had stopped somebody on the street 150 years back and asked them what the heck an oil change is, they would have stared back at you blankly. But in this day and age, even if you don't own a car, you know what an oil change is; you know what an engine check light is. With new technologies, if there are ways of building in mechanisms that raise people's awareness of potential dangers, people are willing to absorb it. I think we need to do the same thing in the smart home.


Debbie Reynolds  10:06

That's fascinating. Thank you. So risk, to me, is a scale: there's high risk, low risk, medium risk, and all of that, to me, needs to be more transparent. I call it the AI risk drift, where maybe companies have early successes doing data collection or something similar in a low-risk way, and it's successful; then they start going after these higher-risk use cases. When I think of your smart home, say, you have multiple devices that are not only working and connected to the Internet, they're probably communicating with one another, and people don't even think about that. There are some low-risk uses, reasons why you would want these devices in your home, where they do things that you think are okay. Say, for instance, your smart refrigerator lets you know when your water filter needs to be changed. Maybe it's annoying, but it's a low-risk type of thing. Then we get into these higher-risk things, where they're like, hey, Vikram keeps his heat up really high; he's probably a bad energy user, so we're going to raise his bill, right? That's more high risk. So tell me a little bit about this, especially when you're getting into these higher-risk areas. That's where the problems come.


Vikram Venkatasubramanian  11:33

There's utility value, for sure, but there's also the question of how your data is being exploited to exploit you, right? This is not a new concept, and there's actually a lot of coverage in academia and popular media around what are called prime vulnerability moments. It's actually a capability used by marketers. We all have these prime vulnerability moments multiple times a day. If I'm hungry, that's a prime vulnerability moment; when I wake up, that's a prime vulnerability moment. All of these essential functions, as I'll call them, are moments where we have specific needs and can be tipped in one direction or the other. That has been studied in a lot of detail, and the amount of data being collected has helped inform marketers exactly when those specific prime vulnerability moments are. Here's a simple example: say you typically eat at a certain time of day, and the data shows you're in a car, driving through some town at that point in time. Showing you a coupon for a specific kind of restaurant at that very moment in your GPS app is exploiting a prime vulnerability moment. It's not by accident; it is by design, and that is what I think people need to be aware of. There are examples of this being exploited. In Australia, some years back, an insurance company got sued because it was using Internet search data to inform its insurance rates. Thankfully, the case against that specific company was won because the laws prevented that. But what happens in countries where there are no such protections, or where there is no transparency around what kind of data the insurance company is getting exposed to? That, essentially, is the risk behind a lot of this. A lot of people don't realize it, but you are already paying a penalty of some kind. You may be getting a $5 coupon, but you might be paying a penalty in terms of price.
Another example people may have seen already: you search for flights, and now your rental car rates or hotel rates have gone up. How is that possible? It's because the information is being shared; there is collaboration amongst all of these entities. So that's the sense of risk and awareness that we need to bring into people's mindset. But the other thing I think is relevant in this conversation, and you hit it spot on, is that there is no such thing as perfect security. It's a risk reduction game; you can't be 100% protected. I think that's harder to convey to a consumer mindset, but it behooves all of us to convey it and say, okay, sure, you may not be able to stop your data from getting everywhere, but right now it's going to 3,000 different companies; how about you reduce that to maybe 100? Your risk profile goes down, and hence the larger profile of risk also goes down.


Debbie Reynolds  14:28

Yeah, I think you're right; we need to be realistic. Minimizing risk is the most realistic thing; you can't really eliminate the risk. Living in the wilderness with no cell phone or connectivity is perhaps about the lowest risk, but that may not be realistic, for sure. I want to talk a bit about the project we work on together. It's the IEEE Cybersecurity for Next-Generation Connectivity Systems, and our group is called Human Centricity Data Control and Data Flow in Context of a Person. We've met some interesting milestones thus far. First of all, we created a questionnaire, called a data control impact assessment questionnaire. It is for any organization implementing or manufacturing any type of IoT or connected system, or for a company that has already implemented some type of connected system and wants to do a gut check to see where they fall in terms of human centricity. I think human centricity is important for a couple of reasons. One is that you should care about the humans; those are the people who are going to be buying your devices or using your services. But we're also seeing a lot more laws and regulations come in to try to make sure that humans are not harmed as a result of this. That regulatory landscape is changing and really tightening up on companies around how their products are being used, trying to prevent or minimize harm and also create transparency. I think that's pretty interesting. But tell me a little bit about this, and then we should talk about the next thing we're working on in this project.


Vikram Venkatasubramanian  16:22

Yeah, so the questionnaire is an exercise meant for self-introspection. I don't think any of us claim that it's perfect or comprehensive, but the sheer number of areas that we managed to collect and hit as a group makes it a very valuable continuous self-assessment tool. What we would want is more feedback from companies and individuals who do use it for the introspection process, to come back to us as a group and say, here's where it was useful, here's where it was not useful, etc. The second thing it does is give us a framework to think about the privacy problem as product designers as well. It's multi-layered in terms of whom it targets: we have examples of how a CISO would use it, how a Chief Privacy Officer would use it, or how a Product Manager would use it. It's meant for various levels of the organization and how they think about the problem within their specific realm or sphere of influence, which I also think is super valuable.


Debbie Reynolds  17:30

Yeah, the thing I think is unique, and tell me if I'm wrong, maybe I'm biased in this regard, is that we went out of our way to make sure that this was about data and not about regulation. That is good, because all companies have data, and not all companies have to comply with certain regulations. I feel like regulation is such a narrow path; we wanted this questionnaire to be as broad as possible for any type of company that's dealing with data.


Vikram Venkatasubramanian  18:01

Absolutely, and I didn't even realize until you said it that nowhere in the questionnaire do we reference any of the privacy laws, either within the US or outside. That's number one. Number two, come to think of it that way, it gives you the broadest framework by which you can possibly adhere to a hodgepodge of laws, because right now in the US it's State by State; it's a hodgepodge. As a product designer, wearing that hat myself, I'm so confused: what does this mean for my product in this State, etc.? What could happen in another State that doesn't have a law today and might have a law in the future? However, if I self-introspect through this questionnaire, through this lens, it does give me a very broad swath of compliance options and actions that I can take, which is actually very true. Yep, I didn't even see that until you said it just now.


Debbie Reynolds  18:49

Well, it was purposeful. I wanted it to be as useful as possible.


Vikram Venkatasubramanian  18:53

Yeah.


Debbie Reynolds  18:54

Trying to look at it from a legal lens doesn't really get to the bottom of the problem, which is data.


Vikram Venkatasubramanian  19:00

Yep.


Debbie Reynolds  19:01

On to the next thing we're working on. First of all, this group is open to anyone who wants to participate. If you want to participate, reach out to me or reach out to Vikram; I can get you into the flow to get you into this project. We have some really fascinating collaborators from around the world, really great thinkers, and I love the fact that we're all different. I think it helps us think about these problems in a multi-dimensional way. We're also trying to create what we're calling, for now, a privacy label. I'm not sure what we're going to call it in the future, but it's basically a better way to assess these types of connected systems. I think one of the complications we have with connected systems is that they change over time, right? So I don't think it's something where you'd have, quote unquote, a static label on it that says, hey, this toaster does x, right?


Vikram Venkatasubramanian  19:58

Yep.


Debbie Reynolds  19:58

So you plug it in, it's connected to the Internet, it's getting over-the-air updates, and the hardware may have the capability to do things it wasn't doing when you bought it. As it's being updated, it can do different things; it may be doing different collections. So tell me your thoughts about our path for the privacy label, or whatever we want to call it.


Vikram Venkatasubramanian  20:25

Fundamentally, the subgroup is about human centricity. What we did with the introspection questionnaire was bring human centricity to the companies that are making the products: how would you, as a designer or Chief Privacy Officer, think about the privacy of the end consumer that you're making this product for? The second piece is giving the end consumer visibility, an easy way to interpret what they're getting in terms of privacy and security. This is the part that I'm super excited about. I know there's already been work in academia around privacy nutrition labels, but there are still some gaps, and I think that's what we are trying to address as a group. Really, it's about, hey, if I'm buying an app or a device, or I'm signing up for a new online service, in one simple, easily readable snapshot, how do I understand what my risks are? That's really what this is leaning towards, exactly like what's on food labels. This is now getting mainstream thanks to the Cyber Trust Mark program, which addresses the security of the apps, devices, etc., that you bring into your home, but it does not address privacy, and that's the other piece we have to bring into the equation. So what if every company could provide such a label? How do we enable standardization groups to take the work we've done, run with it, bring it into standards, and help companies adhere to those standards? That's the intent here. This is interesting because we want more people here: not just privacy people speaking about this, but non-privacy people also participating and providing feedback, because it gives everyone a window of opportunity to review this before making any kind of online purchase.


Debbie Reynolds  22:12

Yeah, the interesting thing about what we're trying to do is that all of us are egghead people, and we would love to have people in the group who are not necessarily in privacy or security, just people who care about this topic and can give some feedback.


Vikram Venkatasubramanian  22:28

Yeah.


Debbie Reynolds  22:28

But then also, we're not replicating or duplicating work that other groups are doing. I know people have heard talk about privacy labels, and there are other organizations that have implemented things like that. But we're looking at the gaps here, and because we're focused on data, we can do it in a broader way.


Vikram Venkatasubramanian  22:50

Yep, I completely agree. I can't wait for what we get out there as a group; perfect or not, I think getting something out there is going to stir up so much debate that some time from now, whether six months, two years, or five years, this could be the basis for something really impactful in how people think about and buy products.


Debbie Reynolds  23:10

I agree with that. Well, I can't let you off the show without talking a little bit about the panel you did at the Consumer Electronics Show in 2024. So tell me about this; tell me how it went.


Vikram Venkatasubramanian  23:22

Oh, this was fascinating. The full video of the panel is available; Parks Associates, the analyst firm that pulled us together, has made it available. The president of Parks Associates has been very gracious; she said she was fascinated by the work we're doing as a group in IEEE and has offered to give it a little more evangelization and see if we could get more privacy people involved. So shout out and thanks to her. The panel was composed of a very fascinating group of people, all the way from people thinking about privacy at the chip level, which was fascinating to me, to a nonprofit group that is looking to take privacy standards and enable enterprises to comply with them. I will say this without any shame: I walked onto the panel, and I don't think I contributed as much as I got out of it. There were whole new ways of thinking about the privacy problem, and whole new ways of thinking about it at various layers of the technology stack as well. That opened my mind. What was also fascinating was that it was at CES, but in a sub-conference focused purely on the smart home, and it opened my mind to the whole ecosystem of new technologies and offerings coming up around the smart home. Smart home as a managed service is a market I was not even aware of prior to that. It opened up a whole bunch more questions; you get some answers, but you get a million more questions, is really what happened. How are the people offering these kinds of entire stacks, including the management of them, thinking about privacy? Are they seeing indicators of interest directly from the consumer in terms of willingness to pay, etcetera?
What is fascinating is that there are companies actually saying, yes, more and more customers are asking about this unaddressed problem, and they are willing to pay for greater privacy care and greater security across all of their devices. We just need more companies from that space to play in this. Think about an entire ecosystem where the devices, the humans, the appliances, the apps, and the services all work together, not just to provide convenience and physical security, but also cybersecurity and related protections like protection against identity theft. It's just going to be a fascinating space. Honestly, I think this is the next multi-trillion-dollar industry, because if every home is going to go this way, the only way it's going to work is for all of these companies in the space to work together, along with the standards organizations that are also in attendance, and adhere to those standards. To me, this was one of the most fascinating experiences; it opened my eyes to all new markets and all new problem sets, as well as a solution mindset, all the way from the hardware level. Fascinating.


Debbie Reynolds  26:15

That's great. I was really excited that you were on the panel and at the show. It makes me feel good when really smart people like you, people I really trust, are able to get on those international stages and show their influence and skill. What's happening in the world right now that concerns you around privacy?


Vikram Venkatasubramanian  26:39

I think there is a lot of talk about AI. Suddenly everybody's scared of AI; we had a Congressional hearing around it, etc. But what makes it powerful is the data. Out here in Boston, in Cambridge, the MIT Museum has a copy of the very first paper on AI; that paper is a year older than me, from 1972, so it has existed for 51 years. So AI is not the problem. It's the data that makes it powerful, and of course there's now more computing power that can be thrown at it as well, which is making it even more powerful and faster. I get that. The big question is, why are people not concerned about the amount of data that's being exfiltrated? Why are there no boundaries and curbs on it? What worried me the most when I listened to that Congressional hearing was that the questions being asked of the companies responsible for a lot of this were so uninformed, and I take a very negative view on that. It's not that I think the people asking the questions aren't smart; I think they're smart. But it worries me that they aren't choosing to ask smart questions, and asking why that is, is really what worries me the most, because it gives me less hope of any kind of governmental protection from the law. What are the negative implications of all this data being fed into all kinds of AI engines without any control? Case in point: the Irish Council for Civil Liberties published a paper jointly with, I believe, Cracked Labs about a month or so ago, talking about what is the biggest data breach in America. They said it's actually happening every day. They focused on just one small piece, the demand-side platforms of ad networks, and how foreign agencies are collecting very sensitive information about Americans. For example, there would be specific codes to identify people who work in the military, or families that have somebody in the military, or even people who work for a company that supplies equipment to the military.
They're able to track specific pieces of information about them, and this is very sensitive, right? Who knows, maybe somebody who works for one of the big defense firms has somebody in the family with some kind of health condition; that is sensitive information, and it's actually being traded thousands of times a year through these ad networks. They tracked a lot of metrics around that and put out a report, and they computed that it's happening thousands of times a day for every single person. So if you think about that, they focused on only one specific problem, the demand-side platforms; expand that to everything else, and the sheer volume of data about us getting fed into these AI engines is staggering. That, I think, is what needs a lot of regulation, in terms of transparency over who gets access to the data and having control. In terms of who has access, I think it's an easy choice to say that countries that don't have American interests at the forefront should not have access to the data. That's an easy, illustrative choice that can and should be made. But beyond that, I think as a personal user I should have a choice in controlling who gets access to my data, and putting some guardrails around it is really where the conversation should be going. Right now, it simply is not, and that worries me a lot.


Debbie Reynolds  29:52

Yeah, I'm worried about that a lot, and I feel like in a lot of conversations, especially around either ownership of or control over a person's data, sometimes people talk about it as though it's impossible. But if it were impossible, then you could not sell it.


Vikram Venkatasubramanian  30:11

Yep.


Debbie Reynolds  30:12

So the fact that you were able to package up this information and sell it tells me that you can control it, because you determined that, based on the highest bidder, you're going to package this information up and give it to someone who's going to use it.


Vikram Venkatasubramanian  30:25

Yep, absolutely. Absolutely.


Debbie Reynolds  30:28

Yeah. So, if it were the world according to you, Vikram, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or technology?


Vikram Venkatasubramanian  30:44

My wish would be that everybody knows how to read and interpret the engine check light when it comes on for privacy, and that a system exists that reliably provides such an engine check light. So when I make my Internet choices, or members of my family make choices about their Internet habits as part of daily living, and we cross certain thresholds, there is some kind of reliable system that provides a warning, a flashing light that says: you may be treading into, or are now transacting with, an entity that may not have your privacy as its best interest. I think bringing that in is essential. Here's where I'll step back a bit: it's easy to have protocols at the device level, and it's easy to have apps that can be regulated or controlled in some fashion, but that alone doesn't secure the home. We found this in enterprise cybersecurity: cybersecurity is a combination of tools, people, and processes, and we can't do the same in the home without involving the people. So we do need a system that actively involves people, back to the human centricity of our group, and for those people to be sufficiently educated that they can make informed choices for themselves. What I want to see, and contribute to eventually, are real-time systems that sit right by you, providing advisories and guardrails so that you can make a choice, and if you want to jump the guardrails, you're fully aware when you do.


Debbie Reynolds  32:16

That's a great dream; that's a great wish, and we're moving toward that. We could definitely make it a reality. I highly recommend that anyone who's interested in this project reach out to me or reach out to Vikram, and we can get you in the mix. We have a lot of brilliant minds all over the world involved in this project, and I think it's going to be a tremendous help not only to humans, who are the end consumers, but also to people anywhere in the supply chain who are trying to get products to market and remove barriers to adoption, in my view.


Vikram Venkatasubramanian  32:53

Absolutely.


Debbie Reynolds  32:54

Yeah. Well, thank you so much. This is so much fun. Thank you. Thank you.


Vikram Venkatasubramanian  32:58

Likewise, Debbie. It is always an absolute honor to be invited on the podcast. So thanks for having me.


Debbie Reynolds  33:06

Well, you're big time, so I had to get you on here.


Vikram Venkatasubramanian  33:10

Thank you.


Debbie Reynolds  33:10

Great. Talk to you soon.


Vikram Venkatasubramanian  33:13

Thanks Debbie.


Debbie Reynolds  33:14

Okay, you're welcome.