Debbie Reynolds “The Data Diva” talks to Leila R. Golchehreh, Co-Founder & Co-CEO, Relyance AI. We discuss her background and what Relyance AI does in privacy and data governance, the difficulties of understanding the needs of large organizations, the proactive and the reactive parts of compliance, changes in products that create privacy havoc, the impact of the lack of Data Privacy as a human right on people’s health and other rights after the Roe v. Wade and Dobbs decisions, how the idea of “data as the new gold” has led to risk due to the overcollection and retention of data, the Amazon acquisition of One Medical, data retention and minimization of legacy data, and her hope for Data Privacy in the future.

Support the show
Leila Golchehreh, Debbie Reynolds
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information businesses need to know now. I have a special guest on the show, Leila Golchehreh. She is the Co-founder and Co-CEO of Relyance AI. Welcome.
Leila Golchehreh 01:20
Thank you so much. Thanks for having me, Debbie. It's great to be here. I'm a big fan of your podcast.
Debbie Reynolds 01:24
Oh, thank you. Thank you. So this is great. So you and I have had other chats, and we thought this would be a great show to do, very timely, with a lot of privacy issues that are happening in the US. But before we dive into what's happening right now in the US, let's talk a bit about you, Relyance AI, and what your company does.
Leila Golchehreh 01:50
Thanks, Debbie. Absolutely. So Relyance AI is a global Data Privacy and protection platform. We're really building the trust and governance infrastructure for the Internet. I've been working in data protection for over 15 years now, largely building out privacy and data governance programs for companies of all sizes, from small startups all the way to large, publicly traded organizations. And what I found repeatedly, Debbie, is that we've had an observability problem. It's been very difficult for organizations to understand how information flows through their APIs, through their infrastructure, through their code, and in general, what the requirements are that flow from their contracts and the laws, and how those compare to that operational reality. Back in 2018, when I had a very short period of time to get a large organization ready for the GDPR, I demoed every privacy tech tool on the market. And what I found repeatedly was just two types of solutions. One solution was really focused on automating workflows. So it took the workflows that I was going through when I would go to engineers and ask them, what are you guys doing with data? Who's getting access, and what's being done to it? And it just automated the process of sending forms rather than actually finding answers to the questions that I had, which is what I was looking for. And I never could understand, Debbie, when engineers were programming in programming languages and writing code, why there wasn't a translation layer that could then explain to me, as a data protection officer, what was happening with information across the organization. Then the second solution that I really found was a database scan, which would effectively scan the information that was laid out in various databases. But as a DPO, I just thought that was way too much access to information, and it would certainly make our customers uncomfortable.
And then the effectiveness of that solution was also unclear; the results were hit or miss. And so, with Relyance, what we are building is really the tool that I always wished I had and could just never find. And so far, we've done incredibly well. This is actually our most successful quarter to date. We've landed wonderful customers, including Zoom, New Relic, Fivetran, and others that we're really proud of. And it's going incredibly well for only a two-and-a-half-year-old company, because we're finally solving the problems that the industry has been facing for a long time without good solutions, by asking privacy questions of code.
Debbie Reynolds 04:37
That's great. Congratulations. So I would love to talk about the two things, to me, that happen in privacy; tell me how you look at these issues. So one is the compliance, or what I consider the reactive part. So a law passes, and then the company scrambles and says, oh, what do we have to do to comply with this law? Right? And then the other is more proactive: you're constantly getting data from people, you're constantly doing different things with data, so what types of things do you think companies need to think about proactively when they're handling data, even before a law comes out? So I feel like there are two different camps here, where a lot of companies feel like, okay, if I'm constantly reacting, I'll be fine, so I don't really need a tool like this, or some people are like, oh, I'm just so proactive that I don't need to worry about the reactive part. So tell me about those two camps.
Leila Golchehreh 05:39
Yeah, this is a great question, Debbie. And I think it's been like playing Whack-a-Mole with the various laws popping up. It's either Whack-a-Mole or alphabet soup; I'm not sure. Something between the GDPR, the CCPA, the CPRA; we've got a Brazilian data protection law, and lots happening across Asia, as well as South America, with data protection developments. So how do we keep up? And I think, you know, Debbie, the best way that organizations can approach this is really by thinking about privacy from a first-principles perspective. We're going to drive ourselves crazy trying to stay on top of each one of these different laws, understand all the nuances, and do a gap analysis between the GDPR and every new acronym law that's popping up. So our approach at Relyance, and the way that our technology operates, is to use first principles. What do all these laws have in common? Effectively, they have three things in common. All of them want organizations to understand what information they are processing. First and foremost, you have to know where you're starting, so that's the data inventory. Number two, they have to understand who is getting access to that information. So that's not only an internal-facing evaluation of which teams within your organization are getting access to information, but also which third parties are getting access to information. So that's the data map. And then number three, every law that we're looking at, across the US, whether it's the newly proposed Federal law or, you know, the new laws that are being discussed in Canada, they all want to understand what we're doing with the information. How is it being processed? What's the nature and purpose of the processing?
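The three first-principles questions described here (what data, who accesses it, and why) can be sketched as a minimal inventory record. This is a hypothetical illustration of the idea, not Relyance's actual schema; all names and categories are invented:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the three first-principles questions:
# (1) what information is processed, (2) who accesses it, (3) why.
@dataclass
class ProcessingRecord:
    data_categories: list[str]                                # what is processed
    internal_teams: list[str] = field(default_factory=list)   # who accesses it internally
    third_parties: list[str] = field(default_factory=list)    # who accesses it externally
    purpose: str = ""                                         # nature and purpose of processing

inventory = [
    ProcessingRecord(
        data_categories=["email", "zip_code"],
        internal_teams=["marketing"],
        third_parties=["analytics-vendor"],
        purpose="campaign analytics",
    ),
]

# The data map is just the access view derived from the same inventory,
# so any new law's disclosure pillar can be answered from one foundation.
data_map = {t for rec in inventory for t in rec.third_parties}
print(data_map)  # {'analytics-vendor'}
```

The design point is that the inventory is the single source the other two answers derive from, which is why the interview describes it as the foundation each new law's pillars sit on.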
And so our approach with Relyance is that if we can answer these first three fundamental questions about data, that empowers us to use that as a foundation, such that no matter which law is developed, no matter which law applies, we can then use that strong foundation of the data inventory and map and understanding of data processing, and place each pillar of each new law on top. So that enables us to prepare ourselves for whatever is to come, no matter how the law is changing. I think, as you just rightly pointed out, Debbie, the laws are changing so fast right now, it can be very difficult and a huge headache for privacy and security and data governance teams to really stay on top of this. So that's the first part of your question. Now the second part: I think one of the big issues that organizations are facing is that we are doing this at a moment in time. And a lot of the solutions out there today are a moment in time. So you go through all of this effort, months and months or weeks and weeks of interviews with various teams to understand data processing, but that information is no longer kept alive. In the same way that new code is always being pushed, and new technology is being developed, and there are new builds and new data processing, we have to have a solution that is focused on the real-time processing of information. And not only, you know, just after things go live into production; we also have data that's already pushed out and is being processed. Organizations need to do what's called shift left. Instead of trying to do a data map or trying to do privacy after the fact, we need to shift left into the CI/CD pipeline, shift left so that we understand what is about to happen with data before there's actually an issue. And that's, I think, one of the really exciting things about Relyance AI: we have been able to help multiple customers detect and prevent security incidents.
A single faulty line of code can mean data processing that we were not expecting. And with technology that's embedded into the CI/CD pipeline, evaluating code before it's actually pushed live into production, this empowers us to ensure that we get a handle on our data processing before there are any issues.
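A shift-left check of the kind described could, in the simplest form, scan the added lines of a diff for identifiers that suggest new personal-data handling and flag them before merge. This is a minimal sketch of the concept only; the patterns and policy below are illustrative assumptions, not Relyance's actual method:

```python
import re

# Illustrative patterns that may signal new personal-data processing.
SENSITIVE_PATTERNS = [r"\bssn\b", r"social_security", r"\bdob\b",
                      r"date_of_birth", r"geo_?location"]

def flag_new_data_processing(diff: str) -> list[str]:
    """Return added diff lines that appear to introduce sensitive-data handling."""
    findings = []
    for line in diff.splitlines():
        # Only lines being added ("+") matter for a pre-merge gate.
        if line.startswith("+") and any(
            re.search(p, line, re.IGNORECASE) for p in SENSITIVE_PATTERNS
        ):
            findings.append(line)
    return findings

diff = """\
+ user.date_of_birth = payload["dob"]
- user.nickname = payload["nick"]
+ log.info("updated profile")"""

findings = flag_new_data_processing(diff)
print(findings)  # ['+ user.date_of_birth = payload["dob"]']
```

In a real pipeline a non-empty `findings` list would fail the build or route the change to a privacy review, which is the "before there's actually an issue" gate the interview describes.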
Debbie Reynolds 10:05
One thing that you talked about that I think is an issue, and a scenario that we've seen come up again and again, is, let's say companies have a product, they've kicked all the tires, and everything is the way it should be, and then something changes, right? So let's say they decide they want to add a new feature to something, but it creates a privacy risk. And a lot of times, companies don't go back and reassess that change, because they think, oh, well, we've already done this, we've done our due diligence, we looked at the code, but let's add this new thing. But then we're seeing the FTC, particularly, be able to call out companies where they're adding these new features, collecting new data types, especially sensitive data, of individuals. And maybe that was not the original intent of the app or application. So, you know, tell me a little bit about that issue, because there was a recent issue with Weight Watchers, where they had an app that was supposed to be like a family weight loss app, and they got fined, I think, $1.5 million because they didn't properly get the consent of parents for children's use of this app. And they had people as young as eight years old using this app, and they were doing things with their marketing data that they shouldn't have done. So a lot of what I see is companies say, hey, let's do this app, we bring out all the heavy hitters, and we have them evaluate everything. And then they're making these incremental changes, where they probably reason, well, let's do it for kids, too, and let's add, you know, this stuff. So companies are adding more risk when they aren't going back and looking at things. But tell me how Relyance handles a situation like that?
Leila Golchehreh 11:58
Yeah, that's an excellent question. And thanks for also raising the Weight Watchers situation with data processing as well, Debbie. So you've just highlighted in your question here the vast importance of having that ongoing understanding of data processing; this cannot be done at just a moment in time. And a lot of the solutions that are out there today are really focused on, okay, let's do a data map and inventory based on an interview I conducted with our engineering team. And it's clear that this approach is not only going to produce inaccurate information, but it's also going to inject risk into your organization. And to your point, we also have government regulators that are now looking into exact situations like this, and even proactively so; as you know, Debbie, the FTC also issued a demand to the top 15 mobile providers to understand data processing. With each of them: what are their Data Privacy practices? What are their data retention practices? So we can no longer rely on information gathered at just a moment in time. Technology is alive, and data processing, data inventory, and data maps must also be kept alive. And this is where I think that shift-left approach, Debbie, is even more important. So in addition to relying on engineers to come to the privacy or security office and say, listen, we have this new projected feature, we'd like to build this new feature or technology, and we'd like your review, we also need to make sure that holds even after a data protection assessment is done. And this is one of the things that Relyance also offers: our data protection assessments, where we help to automate a significant portion of that. It's really exciting machine learning tech; if there's time, we can certainly get into it. And we also compare what is said in that data protection assessment against that operational reality. And that's what I think is really important.
Because even if, and this can be, it's often not even intentional, we just missed something, right? An organization just missed something and said, you know, we expect that we're going to be processing data this way. But a faulty line of code ends up not processing the data in that particular way. And now we have a problem. We didn't get consent for that type of data processing. And so that's why having that clear understanding and that translation layer that we were talking about earlier, Debbie, is super important. We have to compare what the requirements are against what the reality is. And this is where Relyance is really standing out and where we've really been able to make an impact on data practices with our customers.
Debbie Reynolds 15:00
Excellent. So I want to talk about an issue that's in the news now. It's interesting because it's a hot-topic issue. This is around Roe v. Wade and the Dobbs decision rescinding certain reproductive rights for women in the US, in certain states and certain places. The reason why, in my view, this is important to talk about, especially for businesses, is because I feel like it impacts every type of business that you could possibly think of. And so, I haven't heard a ton of talk about how this impacts businesses and things that they need to do. But before we go into that, I want to take a high-level view of privacy in the US, especially for our international audience, or the national audience, that may not understand why this is a big privacy issue. So in the US, we don't have privacy as a fundamental human right. A lot of our privacy rights exist as a result of consumerism, so if you're not consuming things, you don't have certain rights. We do have rights that are laid out under HIPAA, the Health Insurance Portability and Accountability Act, which, in my view, is really not a privacy law; it just has a part of the law that's about privacy, because when you're transferring people's data, it needs to be protected. But then there are a lot of things about people's health that aren't necessarily protected in the same way because it's not a patient-provider situation. And so the Roe v. Wade ruling came out about 50 years ago; it was recently rescinded with the Supreme Court's Dobbs ruling. It will have a lot of different effects, but one of the privacy impacts is that women don't have the same level of privacy protection for certain reproductive health issues. And the reason why this is a huge issue in the US, again, is that privacy is not a fundamental human right; some of these things aren't protected on a human level, only on a consumer level. And then also, in the US, we don't have universal health care.
So a lot of people's health insurance is through their employer. This creates more obligation and more burden on the employer in a lot of different ways. It may impact the pricing of health insurance. You know, I think some companies are trying to give some type of travel health benefit to people so that they can get certain reproductive health care. There's a law enforcement angle to this: law enforcement being able to seek information about maybe the travel, or about some details of people's health, and that definitely goes to employers. And then to your point, with the FTC, the FTC has signaled that they are asking telecoms and other big companies: how are you handling location data? How are you handling data about where people go, and data brokers? We know that there's a bill pending in Congress right now about the Fourth Amendment; I think it's called the Fourth Amendment Is Not For Sale Act. And that bill is basically addressing something we have in the US called the third-party doctrine, which says that data held about an individual by a third party doesn't have the same protection that you and I would have in, for example, a Fourth Amendment situation, which covers unreasonable search and seizure. Right. So with that loophole, law enforcement, instead of doing a proper investigation and getting the proper subpoenas and warrants, could just go buy a report from a data broker. And so a lot of these actions by the FTC, this particular bill, and also people trying to create more reproductive rights protection laws, part of it is to protect the privacy of that information so that it doesn't have to be spilled out to an employer or to other people. So this is a huge issue. There's a lot to talk about, but give me your thoughts about that.
Leila Golchehreh 19:43
Yeah, there is definitely a lot going on in the way of privacy today in the United States. So I really appreciate the question. And, you know, the Dobbs decision came as quite a surprise, I think, to many people working in the area of Data Privacy and data protection. And I think if you read the decision, Justice Thomas's concurrence talks about not only revisiting Roe, but also some of the other what are called substantive due process precedents, in cases including Griswold, Lawrence, and Obergefell. And what's really interesting, no matter what your views are on any of the issues that Justice Thomas has outlined in his concurrence in the Dobbs decision: no organization wants to find themselves today in the middle of a dispute between their users and the government. And what is generally projected to happen, largely with technology companies, Debbie, is just a slew of various requests to turn over information on individuals, and particularly, in light of the Dobbs decision, specific healthcare-related information. And you mentioned privacy, Debbie. I think something very important to point out is that we assume that our healthcare information is protected all the time. But actually, HIPAA only covers what are called covered entities; those are healthcare providers, some insurance companies, and other companies that are really tied directly to the receipt of medical services. But what we should consider is that, in combination with other data elements, information that we might have once considered totally benign, right? Something that, just in a vacuum, doesn't seem like it is so important. So actually, why don't I give an example? With only three data elements, just zip code, gender, and birthday, we can predict with 87% accuracy who a person is in the United States. That's three data elements, right?
You wouldn't think that just knowing your birthday in a vacuum is a big deal, or just knowing your zip code in a vacuum is a big deal, or just knowing your gender in a vacuum. None of these are highly sensitive pieces of information. But when you use a data model that combines them, we can predict with 87% accuracy who you are. And I think this fact demonstrates the importance of every organization understanding, through observability, what data they have, who's accessing that information, and what's being done to it. You might think that your organization is in some kind of area that doesn't touch on any of the current events. But in combination with other data elements, it might. And I think, most importantly, every organization needs to take a look at whether they have been very clear on how they will handle any kind of request from a government agency, on how they'll respond, right? Or do they know exactly how they're going to respond? Have they really mapped out their data, so they understand who within their supply chain is getting access, and not only what their own data handling practices are, but what the data handling practices of their third-party vendors are? They may have every good intention of protecting their users' information, but they're using, you know, 40 vendors in their supply chain that are all touching user data in some way, whether it's a hosting provider, or maybe some kind of OCR technology, or some kind of data processing or marketing application, and they might not even have considered the flow-down requirements of what they have promised their users, and how these third parties in their supply chain are going to be handling this information.
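The re-identification point above (the combination of zip code, gender, and birth date) can be made concrete with a tiny uniqueness check over those quasi-identifiers. The records below are invented for illustration; the technique is simply counting how many records are unique on the triple:

```python
from collections import Counter

# Toy dataset: each tuple is (zip_code, gender, birth_date).
# None of these fields is sensitive alone; together they often single a person out.
records = [
    ("60601", "F", "1980-03-14"),
    ("60601", "F", "1980-03-14"),
    ("94105", "M", "1975-07-02"),
    ("10001", "F", "1992-11-30"),
]

counts = Counter(records)
# A record is uniquely identifiable when no one else shares its triple.
unique = sum(1 for c in counts.values() if c == 1)
print(f"{unique}/{len(records)} records are uniquely identifiable")  # 2/4
```

On real census-scale data this kind of count is where figures like the cited 87% come from: most of the US population is unique on exactly this triple.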
Debbie Reynolds 24:04
I think one of the major, major things this highlights, and I'd love your thoughts on this, and I've been talking about this forever and I'm glad people are starting to talk about it more, is the idea in the past that data is the new gold. I say data is like a lump of coal, because not all data is good data. In the past, overcollection of data wasn't a big deal. So it was like, oh, we have all this data, let's just keep it forever. Now what we're seeing is that there are real risks to organizations in many different areas, including this one, from overretention and overcollection of data. So companies really need to assess, like, why are we collecting this data? Why do we have this? You know, something really crazy: my sister says she got a note from some health provider, and it literally had her full Social Security number on it. Like, why would you do that? You're creating a risk for your organization that you don't have to have. So I'm hoping that these risks help companies really reevaluate why they're collecting data, and maybe change what they collect and why they collect it, and also data retention. Not everything needs to be kept forever. So I think if companies can be more purposeful in how they collect data in the first place and what they do about data retention, they can greatly reduce their risk and also their administrative burden. Give me your thoughts on that.
Leila Golchehreh 25:46
Exactly, Debbie. And that is very interesting about your sister's story here. And, you know, I think everyone was using Social Security numbers just maybe 10 or 15 years ago, and we still use them as identifiers for various things. And it's like, this is extremely sensitive information; is there not an alternative? I think that this is just kind of the way that things have been done. And sometimes, we've even forgotten. So we might have been holding on to information that we don't even know that we still have. So your point on data retention is absolutely critical. I think there are a few things that we should highlight, and actually, I really want to bring this back to what we're trying to accomplish here. So, you know, again, no matter any personal views on the Dobbs decision, for any technology organization, and any other organization that's not even in technology, user trust, or customer trust, or the trust of anyone that you are doing dealings with, is absolutely paramount. It is fragile. And once it's broken, it's very hard to come back and repair. So no matter what any personal views are, I think what's so important is that we evaluate: are we building an environment, and are we processing information, in a way that's going to facilitate trust? So there are a couple of things that I think really every organization should be looking to do right now, particularly in light of Dobbs. First and foremost, you know, we talked about the data inventory and mapping, and, shameless plug, that is exactly what Relyance does, because that's the foundation for building a global data protection program. There's also a privacy center that really every organization should look to build. I think, you know, Debbie, one of the other issues is that these privacy statements, people are just not reading them. That's a reality. People just don't read them.
They're hard to read; it's buried in legalese. Just tell me what you're doing with my information in terms I can easily understand. So make a privacy center visible, make it easy to understand, use plain English, don't use a bunch of legalese, and just explain to people exactly what you're going to do with their data. That's something Steve Jobs said, right? Tell people what you're going to do with their data; ask them, can you do this with their data, and ask them every time, until they tell you to stop asking them. And it's so nice to see that has, you know, continued to ring true even today. And I'm sure you recall, Debbie, the San Bernardino situation, where Apple was hit with a demand, which, you know, technology companies may start to expect right now, given some of the things that are happening in the country today, to offer a backdoor so that the FBI could gain access to a suspected terrorist's cell phone. And Apple, you know, they have a lot of resources, so they could fight it, because they understood the value of user trust, and that if they allowed this, it would set a horrible precedent for the ability to access individual information. And Apple fought this, you know, risking prosecution for failing to comply with an order issued by the government; they decided to fight it. Now, companies also, in addition to the privacy center, really need to take a look and think about: if they get hit with a subpoena to hand over information about their users, how will they respond? Are they going to comply with that subpoena? Are they going to inform the data subject? How exactly are they going to respond? So that's really important. And one of the other issues I want to focus on is that maybe companies can just avoid that need in the first place, to your point, Debbie, around data retention.
If we can just delete data we no longer need, and only keep it for the period of time that we need that data, then we might not even have to deal with this. And I think that's probably the best place that any organization can find themselves, because, as we mentioned earlier, no organization wants to be in the middle of this dispute, you know, potentially against the prosecution of their users, not just women but also men. And one of the best ways to avoid that will be to ensure that we delete data that we no longer need.
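The retention practice described here (keep data only as long as its purpose requires, then delete it) can be sketched as a simple scheduled sweep. The categories and periods below are illustrative assumptions, not a recommendation for any specific jurisdiction:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule per data category (periods are illustrative).
RETENTION = {
    "support_tickets": timedelta(days=365),
    "marketing_leads": timedelta(days=90),
}

def expired(category: str, collected_at: datetime, now: datetime) -> bool:
    """True if a record has outlived its declared retention period and should be deleted."""
    return now - collected_at > RETENTION[category]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
# A marketing lead from January is past its 90-day window; a support ticket is not.
print(expired("marketing_leads", datetime(2024, 1, 1, tzinfo=timezone.utc), now))   # True
print(expired("support_tickets", datetime(2024, 1, 1, tzinfo=timezone.utc), now))   # False
```

Run regularly against a data inventory, a check like this is what turns "we only keep what we need" from a policy statement into something enforced, so there is simply less data for any subpoena to reach.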
Debbie Reynolds 30:08
Yeah. And I will also say, you know, I knew someone who said that they had an app for their job, and this app has location on it, right? And sometimes you may need that for your job, to drive a truck or something like that. But if these apps are on your phone, some organizations are continuing to track people when they're not at work and they're not working. So that's also a loophole that organizations really need to look at. If you're making a conscious effort to make sure that you're not tracking people when they're not at work, or different things like that, I think that will create less risk for the organization and also help protect individuals. So we all know that organizations are going to do what it takes to follow the law. But I think that organizations can greatly reduce their risk and their administrative burden if they can easily say, hey, well, this data we don't collect, or we only have X. Being able to tell that story, and not have to scramble to be able to answer it, puts organizations in a better place. Also, when you mentioned Apple and San Bernardino, you know, Apple has constructed their apps and their tools in a way that, for example, with certain passwords and certain things on your phone, they don't have the capability to unlock it and see what's there, because that's the way their product was developed. So you can't make someone develop something that is the antithesis of their product, right? So I think what we're saying to organizations is, look, figure out what data you need, especially if it's personally identifiable information of your employees, and figure out what your risk is in a situation like that. If you were ever asked to turn over information, would that information be subject to something like that? And then decide if you need it, so you, as a business, can decide what data you keep and what data you don't keep.
Leila Golchehreh 32:15
Yes, 100%. And I think one of the really interesting and also exciting things about our technology is that we are always surprising our customers with what information they have, right? And also, just with respect to retention, it's just so helpful to see visually, on a graph, exactly what information is being processed, because that can change at any given moment. And as a data protection officer myself, one of the things that always kept me up at night, Debbie, was that I would sign these great data protection agreements, I would write these privacy statements, but from the moment that I finalized and published it, or the ink dried on the DPA that I had executed, I didn't have visibility going forward, right? And so that ongoing visibility and understanding, through observability technology, of exactly how information is going to be processed, who's accessing that information, and what they're doing to it is absolutely essential, certainly in light of Dobbs, but also just generally. You want to make sure, you know, to the issue you just raised with your sister, why were they still holding on to a Social Security number? I mean, data breaches can be extremely costly for organizations. So, you know, whether or not the Dobbs decision came down, this is just good practice. It's good hygiene, and it ensures that you maintain user trust if you are a B2C company and customer trust as a B2B company. And, you know, I would also point out that even as organizations are carrying out RoPAs, records of processing activities, or data protection assessments, they should be maintaining a record of whether or not one of the companies they're doing business with has been involved in any kind of security incident. There is a significant push toward more transparency in data processing: tell us exactly what you're going to do. You know, channeling Steve Jobs here, tell us exactly what you're going to be doing with information and how you're processing it.
And just make sure that you're clear across the board: getting rid of data you no longer need, keeping it only as long as you need it, and also minimizing the amount of information that you collect in the first place.
Debbie Reynolds 34:25
Yeah, I want to make a quick point, and then I want to talk a little bit about data minimization. So I had a reporter call me this morning asking me about the Amazon acquisition of One Medical and the privacy implications of that. And I have friends who like it; you know, they like the fact that they have an Amazon Prime membership, they have a Ring doorbell, and now they can use the same login and password to get into their medical stuff, right? But, you know, some other people are like, you know, I really want to separate that data. Part of that is people wanting more transparency about what is happening with their data and how companies are using it. So definitely, I think people are asking more questions, which I think is a good thing. I want to talk a little bit about data retention in terms of being able to handle legacy data. So this goes toward litigation and subpoenas and document requests, right? I've been involved in that process for decades, so I know way more than the average person about these types of things. And what's always surprising to companies, when they end up the subject of a litigation, a document request, or a subpoena, is how much legacy stuff they have that may be subject to those requests. And a lot of times, it's stuff in back rooms, things on servers, things in clouds that they forgot about; maybe the knowledge about that data has aged out of the organization, so no one can really tell you. And I feel like this is a way, and you can explain to me how Relyance works on this, that companies can really leverage technology to figure out: what is all this stuff? Why do we have it in the first place? Do we need it? And maybe be able to get rid of it. What are your thoughts?
Leila Golchehreh 36:31
Yeah, great question, Debbie. And this is why we say privacy is in the code. There is no way we can fully appreciate how much information is being collected in any given SaaS application that's being used unless we actually look at that source of truth. Conducting interviews with various team members can be informative, but it is definitely not complete. And what we do is, rather than having a data protection professional kind of chasing down teams trying to get answers, and as a litigator, you'll appreciate this, Debbie: if we're in litigation and we have a witness who's relying on hearsay, that's not admissible in court. And here we are with privacy, which is a fundamental human right, and hopefully we'll get to the point that we fully recognize that, yet the way that we handle people's information is based on what someone told us is being done with data. That is hearsay. That is hearsay. So our approach is that we can no longer continue to rely on what people tell us they're doing with information just because someone interviewed them and that was their best guess from memory: "I think this is what we're doing." Even beyond that, it also really stresses out the engineering team to get questions from a lawyer on how they're processing data; it's just not fun for anyone involved. They're doing their level best to understand data processing at their organization, but they themselves often don't know exactly how information is processed. And when you're an old company, and I've had clients that are decades old, Debbie, you have legacy systems, code repo upon code repo, and there's information being processed that you just didn't know about. And then, to your point, Debbie, as soon as you get hit with that subpoena, you go, okay, let's go find it.
And then that's what activates your search to get rid of data? How about we try to avoid needing to respond at all because you've already deleted that information? That is our approach, especially given the court that we have now, this particular Supreme Court, and some of the language in that decision. This is not the only case relying on substantive due process privacy protections that might be overturned, and I believe this Court might just be getting started with its evaluation of how far we extend that, quote unquote, Constitutional right to privacy. I mean, I have my personal views; we do have a Constitutional right to privacy. And then there are various conservative views that we do not, because the framers didn't intend that to be embedded in the Constitution. But nevertheless, we currently have a court which is saying that privacy rights are not necessarily afforded under the substantive due process clause, and these decisions will be revisited. So where do we want to find ourselves as a tech company, right? Do you really want to be in the middle of this dispute over what is considered to be a right to privacy or not? Or do you want to just do what's right for your customers and for your users, to make sure that you are protecting this fragile, fragile trust that you have gained with people who are sending you their information and trusting you to process it for the purpose for which you collected it, right? No one is sending their information to technology companies so that that information can get handed over to third parties, whether that be third-party vendors or the government, to do something that they're not aware of. So this is the time; now is the time for organizations to get a complete and clear understanding of exactly what is happening within their own systems, because quite frankly, most companies just don't even know.
Debbie Reynolds 40:40
That's great. So if you were the world, according to you, Leila, and we did everything you said, what would be your wish for privacy anywhere in the world?
Leila Golchehreh 40:53
Oh, wow. What a question, Debbie; that's a good one. At this point, I think we're really at a critical juncture, and privacy is probably undergoing one of the most important shifts of our lifetime. With COVID and the technology-eating-the-world transition, there has never in our history been a more important time to protect people's information; now is the time to do that. And we cannot go about protecting individual privacy rights unless and until we first understand exactly what our own practices are. So in an ideal world, we're first looking inward to understand exactly what we're doing with people's information, and we're clear about it: no more surprises. We're just not living in a world anymore in which we can surprise our users or our customers about exactly what we're doing with their information; we need to be transparent, and we need to have full observability. And I would say this applies certainly inward, within our own organizations, and also across our entire supply chain. It's incredibly important, even if we are the most well-intentioned organization in how we handle people's information, that we're not just looking inward but also looking to all the third parties that are touching that information. 70% of security incidents happen with our third-party vendors; that's very significant. It means we can no longer just rely on, okay, we've done the right thing, we've done our privacy statement. We really need to take a holistic look at what's happening across our entire supply chain. So in my ideal world, we are protecting information and ensuring that third parties accessing that information are doing the same. We flow down the same requirements that we have promised our users, to build that user and customer trust, all the way down through our vendors. And we have a real-time picture, based on actual information and data flows, of how information is being processed.
And we no longer have to rely on hearsay or what someone tries to remember about how they're processing information.
Debbie Reynolds 43:19
That's amazing. Thank you for that. I like to say that I think privacy should be more operational as opposed to aspirational. Often people write these big policies about what they wish happened or what they think should happen, and that may not be the reality. So I think companies need to get back to basics, back to reality. And people don't have to guess, because we have tools now that can actually tell you the truth; they can tell you exactly what's happening, so you don't have to wonder.
Leila Golchehreh 43:51
Debbie Reynolds 43:53
Well, thank you so much for being on the show. This is a great episode; I'm sure the audience will really like it. And I look forward to collaborating.
Leila Golchehreh 44:00
Likewise, thank you, Debbie. And thanks for asking such great questions at what I think is a very pivotal moment in the world of privacy and the protection of people's information. And we're at Relyance.ai, in case anyone is interested in how we can help in this new frontier of privacy rights.
Debbie Reynolds 44:19
Excellent. Excellent. We'll talk soon.
Leila Golchehreh 44:22