"The Data Diva" Talks Privacy Podcast

The Data Diva E16 – Amar Kanagaraj and Debbie Reynolds

Debbie Reynolds Season 1 Episode 16


Debbie Reynolds "The Data Diva" talks to Amar Kanagaraj CEO of oneDPO.  We discuss Cultural differences between privacy in the U.S., Europe, and India, update on privacy in India, tackling the problem of complying with privacy regulations worldwide, privacy opportunities in India, far-reaching implications of Data Privacy regulations in the future, leveraging technology to address Data Privacy, privacy engineering and privacy leaks, identity and biometrics, data access audits, data scraping, and his wish for data privacy in the future.




Data Diva Amar Kanagaraj

41:44

SUMMARY KEYWORDS

privacy, data, technology, India, companies, people, biometric, big, downside risk, person, problem, talk, business, case, implication, protect, probability, happening, point, risk

SPEAKERS

Debbie Reynolds, Amar Kanagaraj

 

Debbie Reynolds  00:00

The personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. This is "The Data Diva Talks Privacy" Podcast, where we discuss Data Privacy issues with business leaders around the world about issues that businesses need to know now. Today, I'm very happy to have a special guest. He's in India right now. Mr. Amar Kanagaraj is the founder and CEO of oneDPO, and he works with companies on things like data breach and privacy risks. Interesting. So you contacted me on LinkedIn. I think you had seen something I had written or something I had done.

 

Amar Kanagaraj  00:55

I listened to one of your podcasts in the past, and I was like, man, I should watch a couple of the other podcasts. Then I thought, I should get connected. So I just pinged you. That's how we got connected.

 

Debbie Reynolds  01:08

Yeah, so you pinged me, and we talked for about an hour, I guess. I was fascinated with your background in technology, how you became interested in privacy, and what you're really doing now. You're very passionate about privacy, and I want to hear why this is the issue you want to solve. I also told you that I almost wish we had recorded that first conversation, because we talked about so many different things; it was a really far-ranging conversation. But I would love for you to talk about your background and trajectory and how you came to privacy as a problem that you wanted to solve.

 

Amar Kanagaraj  01:58

Sure. I'm originally from India. I did my engineering in India, then moved to the US for my Master's in Computer Science. I went to LSU, a big football school. Before LSU, I didn't know what American football was; I thought soccer was football. After graduating from LSU, I joined Sun Microsystems. It was the dot-com era, a very exciting place, with a lot of innovative things happening at Sun. After three, four years at Sun, I wanted to learn the business side, so I went to Carnegie Mellon for my MBA. Right after graduating from Carnegie Mellon, I joined as a management consultant. I worked in and around the Midwest: Cleveland, Pittsburgh, Chicago, across many different industries. But one thing I really missed was technology. I really wanted to work in the technology space. So I left management consulting and joined Microsoft as a product manager. At that point, Microsoft was planning the launch of the Bing brand. Rarely do you get an opportunity to launch a brand, let alone such a big brand as Bing. So I joined the team. It was a good run. We were almost nobody in the space, and we grew to 20-plus points of share in the US along the way. One of the projects I did was Bing for Schools, where Bing said we will not track data or show ads for searches coming from schools. I was the product manager who worked on that initiative. That's when we looked at how we could protect privacy and what the business implications were; that is when I first started getting exposure to Data Privacy and its different aspects. Right after Microsoft, I always wanted to start a company. My friends had started a company called FileCloud, and I joined them as a co-founder, leading marketing for FileCloud. Around 2017 is when I first heard about GDPR, when our customers came to us and asked, hey, do you support GDPR? How are you going to help us meet GDPR? I remember searching, what is GDPR? That's when I first read about it, and I was like, man, this is big. GDPR totally shifted the power from the businesses back to the consumers. Prior to GDPR, companies might not tell you how they were going to use the data, or even what they were collecting or how long they would keep it. With GDPR, they have to be accountable. The power shifts back to the consumer. This has big implications for how companies deal with consumer data, what data they collect, and how they deal with privacy inside the company. This is where I think technology can play a big role: we can help these companies achieve Data Privacy. Hence, I left FileCloud and started oneDPO. At oneDPO, our thesis is simple. Companies have a lot of data, much more data than they think they do, at least 100 times more. They've got complex environments, and finding and fixing breach and privacy issues is hard. One incident can wipe out years of brand equity and bring financial loss, loss of trust, many other things. Privacy policies and processes alone are not going to stop it; they can control it, but they can't prevent it. You need technology to do this. So that's where we bring technology and privacy engineering in, to help companies identify breach risk and privacy risk and address them early, before it turns into something big that can pose a big risk to the company. That is a quick background of how I got exposed to Data Privacy and how we ended up starting a company in privacy.

 

Debbie Reynolds  07:02

Yeah, that's fascinating. It would be great if you could give me your perspective. You've been in the US, you're in India now, and you've seen how people are handling privacy globally. What do you see now that surprises you, that you maybe hadn't anticipated?

 

Amar Kanagaraj  07:23

Very interesting question. I'm glad you brought this into the mix, because while I was doing my MBA, I spent a semester in Germany on an exchange program, and the concept of privacy was very different in India versus the US versus Europe. Culturally, growing up in India, privacy was not something people talked about or something that was in the public domain, like in the news. In the US, we talk about privacy. But when I went to Europe, I heard how important Data Privacy is there, given that some of the data collected through the census was used in World War Two for discrimination and doing really bad things. They are very careful with data, and there is a cultural inclination to see privacy as a human right. That was something very fascinating about Europe. The US has some privacy, more of a freedom-of-speech kind of thing, but not to the extent Europe has, and India was not even thinking about it till recently. But things change. In the last two years, across different countries, things have changed. What has changed in India? Around 2017, there was a sequence of events that led to India finally recognizing privacy as a fundamental right under its Constitution, so every citizen has the right to privacy. That is a shift. Now privacy becomes your right, and that has implications: legal implications, as well as many different facets. That is something fascinating that's happening in India, and there is a sequence of things happening there that makes India a very interesting space. From a business point of view and a maturity point of view, Europe is way ahead, leading the charge, and the rest of the countries are following, but India is closely catching up. The latest bill that is in discussion is, I would say, based on GDPR, but it's a subset of GDPR. Some things we don't like, which are a big no-no in GDPR, are part of the bill. For example, the government has certain authority, a right to access some of your data, where a corporate does not; the bill makes that distinction, whereas GDPR applies to everyone. But at least it's one step closer to protecting Data Privacy. So it's fascinating to see the change from when I grew up to what's happening now. Things are moving. For me, it's fascinating.

 

Debbie Reynolds  10:47

Yeah, I'm fascinated with India. I did a video a while back about India, and I talk about it because you just need to keep your eyes on what's happening there right now. Especially here in the US, I don't know that a lot of people are really paying attention. So at a high level, I'll just talk about India and why it's important that we take a look at what's happening there. India is the biggest democracy in the world; there are more people in India than in the US and Europe combined. Facebook's biggest audience is in India, and there's a lot of investment going into India right now. A lot of tech companies, especially, are very interested in India's Data Privacy bill, obviously, because they want it to benefit them in some way. I feel like India has a really big opportunity right now to institute something very important in its privacy bill, and it may influence other countries to follow along. I remember when India recognized privacy as a fundamental human right under its Constitution a couple of years ago, and I was very happy to see that. That's something I'm really pushing for here in the US.

 

Amar Kanagaraj  12:09

Yeah, and you said it right. India is really a large market; we are talking about a billion-plus people. Look at cell phone usage, the rate at which broadband is penetrating. Everything shows India will be a top market for many of these global players: Google, Facebook, everybody. So privacy becomes front and center. Because one thing is collecting data, and another thing is using the data. If you can use the data to the point where you can swing elections, now we are talking about an impact on democracy itself. The power this data has over governments, individual citizens, the economy, it has far-reaching implications. Data is extremely powerful, so it's good that countries realize it. And I feel the same way. This is a generational problem; it's like climate change, where if you don't act at the right time, it becomes something huge, and we will be fighting it when it's really, really too late. At least now we have the technology. We can grow privacy technology along with the rest of our technologies and catch up. That's what I feel about privacy.

 

Debbie Reynolds  13:43

So you said something to me about why we need technology to help solve this problem. I would love for you to talk a bit about that. I agree with that, by the way. But tell me what you think.

 

Amar Kanagaraj  13:59

Think about the quantity of data in the three ways people describe it: the volume, variety, and velocity of data. Volume, because we are collecting more data. Variety, because we are collecting from IoT devices, cameras, mobile phones; look at the variety of data and formats flowing in, GPS locations, all kinds of things. And velocity, because each moving car and cell phone is generating thousands of data points in a given second or minute, which increases the rate at which we are getting this data. This is a huge technical problem. If you take a traditional approach to tackling privacy, like compliance 20, 30 years ago, yes, you can do some processes and you can do training. Processes and training are still a given. But without technology, it's very hard to manage this massive volume, variety, and velocity of data. I'll give an example. I was talking to an executive leading privacy at a large company, say a Fortune 500 company. He said, imagine how many databases we have. I said, you probably have 10,000. He said, no, 100,000 databases, not tables, databases, inside the company, leave alone the files. And that's just what's in the cloud; don't even ask about what's not in the cloud. The cloud databases are the only things I can track; the rest I can't even track. So think of the amount of data these companies have to deal with, and they have built a business around it. We cannot go overnight and say, stop your business. That's not fair to the employees or the customers. So we have to provide technology that helps them run their business with the right guardrails and parameters, technology that helps protect the data while they run the business. That's where privacy technology becomes critical. When I talk about privacy technology, it may not be cutting edge; it could be well-understood technology. What we have to think about is how we can apply technology at scale. It's not about fixing one thing. A lot of technology addresses the new data we collect, but what do you do with the volume of data we have already collected? Addressing all these things requires technology. That's where new startups are trying to break in. It's fascinating to see how many companies are coming in and attacking this problem, which is a good thing. Eventually, we can crack the problem.

 

Debbie Reynolds  17:14

Yeah, we talked a bit in our last conversation about Silicon Valley and what you're hearing and seeing from your colleagues and contacts who are working in these big corporations that we see in the news every day about privacy. What is your sense of the people who are working on privacy in the engineering space? What are they thinking and feeling right now?

 

Amar Kanagaraj  17:47

Being a privacy tech and privacy engineering company, we talk to many medium and large companies that are solving privacy. For the companies on the leading edge, like Google, Facebook, and Microsoft, it is fascinating to see the technology they have and how they are addressing this problem. Let me give a couple of examples. One of the companies was talking about how this whole division of privacy engineering is coming up; let me give that context first. What is privacy engineering? Today, the privacy problem is seen as a legal problem, because in most organizations, privacy sits under legal. Rightfully so, because we are dealing with law, with compliance, with conflicts between different laws and contractual agreements, many things. So it's largely under legal. But there is a new, growing area of privacy engineering. Privacy engineering is a broad area where you apply engineering techniques to tackle privacy. For example, there are metrics to track privacy; you can measure how much privacy leakage is happening. Say I'm going to summarize all the salary data in an organization by age. What if there is only one person in the 40-year-old range? Somebody can easily point out that person's income, because that's the only person in the 40 range, or the 30 range, or the 20 range. That's a privacy leak. How do you avoid privacy leaks? It's easy to catch in a small dataset, but when you have to deal with large datasets, how do you deal with it? So people are talking about measuring privacy, and also taking it to the next level: if we have to share and work on the data, how can we trade off some of the accuracy in the data for privacy? Take the salary case. What if we drop the data point for the person who is 40 years old from the summary, or add two or three data points? Now your data outcome is not accurate, but you're protected. So there's a trade-off. I'm not saying you should start by exactly dropping data points or adding noise, but that is one way you can think about Data Privacy: giving up some of the accuracy, some of the business benefit, while protecting privacy. Think of having your own privacy budget: I have so much data; if I continue to do what I do, this much privacy is at risk. How can I reduce the risk by losing some data or making the data less accurate? Things like that. I can go on, but I'll give one more example and stop there: GPS data. I can pinpoint exactly to a home, or I can make it less accurate and pinpoint only to a city, a zip code, or a street. Yes, I lose the accuracy of knowing exactly where the customer was when they made a purchase, took a car ride, or made a call, but in the bigger scheme of things, you're protecting privacy. We are trading off some accuracy, some business benefit, for the greater good of privacy. Does that make sense?
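[For readers who want to see these trade-offs concretely, here is a minimal Python sketch of the three techniques Amar describes: suppressing under-populated groups from a salary summary, adding Laplace noise to an average in the spirit of differential privacy, and coarsening GPS coordinates. The records, the group-size threshold, and the epsilon value are illustrative assumptions, not oneDPO's implementation.]

```python
import random
from collections import defaultdict

# Illustrative records: (age_decade, salary). Only one person is in their 40s,
# so a naive per-decade average would reveal that person's exact salary.
records = [(20, 52000), (20, 61000), (30, 75000), (30, 82000),
           (30, 90000), (40, 130000), (50, 98000), (50, 104000)]

def summarize_salaries(records, k=2):
    """Average salary per age decade, suppressing groups smaller than k.

    Suppression is the 'drop that data point' fix: the lone 40-year-old's
    group never appears in the published summary."""
    groups = defaultdict(list)
    for decade, salary in records:
        groups[decade].append(salary)
    return {decade: sum(s) / len(s)
            for decade, s in groups.items() if len(s) >= k}

def noisy_average(values, epsilon=0.5, upper_bound=200_000):
    """Average with Laplace noise: trade accuracy for privacy.

    The noise scale is sensitivity/epsilon; a smaller epsilon (a tighter
    privacy budget) means a noisier, less accurate answer."""
    true_avg = sum(values) / len(values)
    sensitivity = upper_bound / len(values)  # max effect of one person
    scale = sensitivity / epsilon
    # A Laplace sample is the difference of two exponential samples.
    noise = random.expovariate(1 / scale) - random.expovariate(1 / scale)
    return true_avg + noise

def coarsen_gps(lat, lon, decimals=1):
    """Round coordinates: one decimal place is roughly city scale,
    so the point no longer identifies a specific home."""
    return round(lat, decimals), round(lon, decimals)

print(summarize_salaries(records))             # 40s group is suppressed
print(noisy_average([s for _, s in records]))  # close, but never exact
print(coarsen_gps(41.88169, -87.62504))        # (41.9, -87.6), not an address
```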

 

Debbie Reynolds  21:51

Yeah, I love it. Wow, that's really an excellent point about accuracy. To me, though, that's a knife that slices both ways. In your example about the GPS data, it makes perfect sense that to protect people's privacy, companies are trying not to give data so granular that you can identify an individual within a dataset. But the other thing I think about is more in a legal sense. Let's say data is being collected for a legal matter, and especially if you're collecting data from people in countries that have stronger privacy laws, you'll probably collect less data, and collecting less data makes sense. But in that way, if your dataset is not accurate or not complete, you may draw conclusions that may not be true.

 

Amar Kanagaraj  22:50

That's true, that's true.

 

Debbie Reynolds  22:52

So that's just a whole different issue that we can definitely talk about. It's fascinating; I love it. One thing I would like to talk about is identity, like identity systems or biometrics. Biometrics is huge; this is something that I've been tracking for a long time, and I see a lot of activity in the identity management space. Biometrics is interesting but also frightening at the same time. I feel like the technology is getting way ahead of the law and regulation. What are your thoughts about that?

 

Amar Kanagaraj  23:32

Identity can be looked at from two different angles. One is from a business enterprise point of view: for security reasons, how do you identify that an employee is the person they say they are, what authorization they have, whether they are using it rightly or posing a risk? These kinds of things all stem from identity. Identity, from an enterprise point of view, is a complex area, because a lot of the time companies give broad access to many people, and many privacy breaches happen simply because too many people had access, or the data wasn't protected enough. Reviewing and auditing this is a complex task. I was talking to somebody who said that every year she allocates one month of her time just to do this audit, just to look at what data each person has access to and whether they need it or not. And they do it every year. That is kind of scary in some sense, because data changes and new data comes in. At least this company does it well; many companies never even review access, it just stays open, putting them at risk. This is something that has to change. It is changing, and privacy especially is pushing it to change. Security thinks in terms of insiders and outsiders: as long as you're an employee and not an outsider, you're okay. Privacy is different. You could be an employee, but there is no reason for you to look at another person's employment record, and you don't have a right to look at parts of customer data you shouldn't see, like their payment data or their purchase history. So it is getting tighter and tighter. The second angle is identity from the outside, from a biometric angle. One thing that really shook me a little was this whole idea that many organizations, government organizations, already do this. There's a company called Clearview. What they do, I think, is facial recognition; they use all the public data to process footage from surveillance cameras and such. It is all public data, but still, it is a slippery slope. Now law enforcement is going to use it. Think of countries or areas where you do not have freedom of speech, where you are fighting against discrimination, or you're part of a bigger movement against the government. What happens if there is technology like this? Even if it stops only a bit of that participation, that is a big implication for society as a whole. If people think, I wanted to participate, but because the government has surveillance I'm not going to show up, that is something we should address, because people are scared. How many people are really, really bold enough to give up everything to fight? In many parts of the world, people are scared of their government. If you give them technology that can identify you through surveillance cameras or your biometrics in many different ways, it is extremely risky, extremely scary. It opens up a society where people are not going to exercise their freedom of speech because they're worried about getting identified.
So that is a bigger problem. The government should have something proper, call it Federal law or something, that restricts how corporates use data, especially identity and biometric kinds of data, and that restricts how government agencies use them. Because if it impacts another fundamental right, like freedom of speech, then it is not okay for them to use such systems and technologies. That's my personal point of view. It's going to be a game of catch-up, because technology keeps growing, and policy should also catch up, is what I think.

 

Debbie Reynolds  28:15

You just brought up an interesting point. Clearview AI has a case pending related to the Biometric Information Privacy Act in Illinois, and they actually lost an argument recently where they claimed the company has a First Amendment right to collect this data about people. No, you don't. This case is still going through the courts and is probably going to be a big deal. Well, it is going to be a big deal, because Illinois has the strongest biometric law in the world right now, with a private right of action; Facebook recently settled a case there for $650 million. So it's going to be very interesting to see how this goes, because it's a data scraping case. But the reason I brought this up, and I'm glad you brought it up because you're the perfect person to talk to about this issue, is that I have technical issues I want to discuss. Let's say you search for something on Google, and it brings back all these results; it says you have a million results in three seconds or something like that. There really are a million results for what you're looking for, but the further you go down the listing, the less relevant the results are to what you're doing. Yet if Google had created their search engine so that if there weren't an exact match it just gave you nothing, no one would use it, right? They would say, oh, this thing doesn't work. So this is an issue with biometric databases. I feel like the people developing these databases want to make them look as accurate as possible, so even if there's not an exact match, they want to bring up some results, which is a problem. One thing I'm seeing them do, and this has happened with other kinds of databases too, is this: let's say there's a record that doesn't exactly match what you want it to match, so people add additional information to it to make sure it comes up in a search. For example, in a biometric database, a company may have images of maybe two people; they don't have a complete image of either person, but they put those images together almost like a composite, so that when someone searches for a person, that record will come up, even though it's not an actual person. The danger is that you're putting out these databases where you're not sure about the accuracy of the information, but you have people using them as if they were accurate, and then taking action against people. What are your thoughts about that?

 

Amar Kanagaraj  31:33

A very interesting topic. The thing is, anything you do mathematically here, we are talking about probabilities. I cannot exactly match a side view of your face, or a side view of somebody walking by, or indistinct shots of a face. It is a probability: because the model has seen similar photos and samples, I have trained it to spit out, hey, based on what I see, there is a high probability this is a match. The keyword is probability. It's the same thing as Google. Google doesn't know the single most accurate answer; it answers you with what has a high probability, and going down the page, you might find the third or fourth link is a better answer than the first, because that's the question you were actually asking. But in criminal cases, or these public safety kinds of things, it's very hard to go by a probability. How do you set that threshold? What is the impact of having a false positive? If somebody vandalized a store, and I look at security camera footage that is not clear, but I run these models to say, likely this is the person, then how accurate is it? What is the probability, what is the likelihood that this is wrong, that it is a false positive? The downside risk is huge. In the case of Google, you can just scroll to the next page and you're fine; you probably lose a few seconds of time. But in this case, it has real implications. You might get somebody into trouble; you might unknowingly do bad damage to their reputation, or cause stress. You call somebody in, saying, I saw you, and they have to come to court. Maybe that person is innocent and finally goes home fine, but the stress it costs that person and the relationships around them is a big implication. So the people using this kind of technology should ask the right questions and understand the technology's accuracy and the probability of getting false positives. The companies should be forthcoming about these things. It's okay to say, I don't have an answer, rather than giving an answer just to give one. That's how organizations should look at these technologies: the technology can give an answer, but it comes with certain caveats and certain context around it. This is a big educational piece; that's the only way to do this. Because the technology is not always going to be 100%; it's going to say, probably this is very close to the person, unless you give it an exact picture under exact conditions.
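[To make the false-positive arithmetic concrete, here is a small Python calculation with hypothetical numbers, not figures from any real system. Even a matcher that flags the true culprit 99% of the time and flags an innocent person only 0.1% of the time is mostly wrong when searched against a large gallery, because the innocent vastly outnumber the guilty.]

```python
# Hypothetical numbers for illustration only: a face matcher searched
# against a gallery of 1,000,000 people containing exactly one culprit.
gallery_size = 1_000_000
true_culprits = 1
tpr = 0.99    # true-positive rate: chance the real culprit is flagged
fpr = 0.001   # false-positive rate: chance any innocent person is flagged

expected_false_alarms = (gallery_size - true_culprits) * fpr  # ~999 people
expected_true_hits = true_culprits * tpr                      # ~1 person

# Precision: the chance that a flagged person is actually the culprit.
precision = expected_true_hits / (expected_true_hits + expected_false_alarms)
print(f"Expected false alarms: {expected_false_alarms:.0f}")
print(f"Chance a flagged person is the culprit: {precision:.2%}")  # ~0.1%
```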

 

Debbie Reynolds  34:46

There was a case that came up recently in the US where a guy was arrested; police came to his house and arrested him because they said his driver's license image matched a video of someone stealing from a store. They took him to jail and in front of a judge, and the judge said, this is clearly not the same person. Even the people arresting him were saying, well, he doesn't really look like this, but the computer told us it looks like him. So they arrested him. They eventually let him go. That's terrible. That cannot happen, and it certainly cannot happen at scale. Also, two things you mentioned, and I'm so glad you said them. I think technology is always good for the heavy lifting that is hard for humans, but computers or technology cannot replace the judgment of a human. And sometimes this data is collected out of context, and the technology doesn't add the context back, yet you're expecting context from the technology you're using. What are your thoughts about that?

 

Amar Kanagaraj  35:59

On that, I would differ a bit on one thing. AI technology is advancing fast, and it can make really solid decisions, because we humans hold a lot of biases; knowingly or unknowingly, we have biases, and those biases might creep into our judgments. Technology doesn't have to carry those biases. But the problem is when you have a huge downside risk. If somebody gets into trouble with the law, for example for theft or vandalism, you cannot just go with technology alone; an element of human judgment has to come in, because the downside risk to that person's life matters. Whereas if you're talking about fixing a price, or how much production there should be, or what something should cost, a machine can take a probabilistic approach. Yes, there's a downside risk, but it's quantifiable; we're talking about 10 million or 100 million dollars. It can make decisions, and you can go with that. When the downside risk is pretty high, yes, human judgment has to come in. Especially in public safety and law, I would say technology can be an aid, but at the end of the day, rely on humans. Humans are not perfect; they carry their own biases, and we get it a bit wrong sometimes. But we have added checks and balances, and we have been tweaking the system. So I would continue to do that: use technology as an aid, and let technology make the judgments where the downside risk is quantifiable, where it can be put in a number.

 

Debbie Reynolds  38:03

Right, that's a great answer. Thank you for letting me geek out about this. This really concerns me when I look at it, so it's great to talk with someone who understands both the technical and the legal issues here. So if it were the world according to Amar, and you had a magic wand and could decide how privacy laws get enacted, in India or anywhere around the world, what would be your wish?

 

Amar Kanagaraj  38:33

One thing I like is that GDPR comes from a human rights angle, rather than "I'm going to do this for the good of my business" or "for the control of my government authority" or whatever. It is coming from a good source, which is human rights. So: think of privacy as a human right, and base your laws and enforcement around that. Second, when you talk about private data, people think every risk is equal. I would say certain risks are huge: some sensitive information leaking has much bigger implications than some non-sensitive information leaking. So don't see all data as equal; treat data differently, and make sure that you protect people from real harm, whether it's physical harm, embarrassment, whatever it is. Rather than saying I'm going to protect every piece of data and every person's privacy and then not doing a good job of it, go after the risks and the data that can cause huge impact. This is something I find very hard to get through whenever I talk to people. Nothing wrong with how lawyers work, but people like to think about the edge cases, the coverage, the nooks and corners. I'd rather spend time on the things that are going to cause a huge downside risk for my data subjects: go after them, protect them, then work your way outward. All data has to be protected, but go after the high-risk data first, then move toward the low-risk data. With the magic wand, I would make companies and countries understand this concept: there are huge risks, and there are many medium and low risks, and you should think about that data differently.

 

Debbie Reynolds  40:57

That's wonderful. Thank you for that. Thank you for that. So all right. So we're coming to the end of our podcast. Thank you so much. This is fascinating. I know the listeners will really like this. I definitely enjoyed it because I am a technology geek and data really interests me. So you're allowing me to indulge myself.

 

Amar Kanagaraj  41:18

Thank you, Debbie. It was fun catching up. I loved the questions, and the topics we talked about, we could talk about for hours. We will catch up some other time. Thank you; I really appreciate your time and your podcast.

 

Debbie Reynolds  41:32

Thank you, and I'll talk to you soon.