"The Data Diva" Talks Privacy Podcast

The Data Diva E41 - Adi Elliott and Debbie Reynolds

August 17, 2021 Season 1 Episode 41
"The Data Diva" Talks Privacy Podcast
The Data Diva E41 - Adi Elliott and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, "The Data Diva,” talks to Adi Elliott, Chief Revenue Officer at Canopy, Data Breach Response Software. We discuss his affinity for data problems, assessing and managing data breaches differently, requirements for data privacy as a result of a data breach, responding and gathering the right information to manage data breaches, the problem of data redundancy, trends of how organizations manage after data breaches, Cyber insurance, phishing emails, advice for companies not yet breached, legacy data concerns, less sophisticated breaches are still dangerous, cybercrime coordination, a need for collaborations education on data breaches and his wish for Data Privacy in the future.


42:21

SUMMARY KEYWORDS

breach, people, data, pii, company, canopy, phishing email, review, business, world, person, document, compromised, ediscovery, space, regulations, hacked, insurance, software, thinking

SPEAKERS

Debbie Reynolds, Adi Elliott


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds, and this is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know right now. I have a special guest on the show, Adi Elliott. He is the Chief Revenue Officer for Canopy software. Canopy software handles data breach response, is that correct?


Adi Elliott  00:38

That is correct.


Debbie Reynolds  00:39

Sweet. So this is going to be a super fun episode, I think, for many reasons. First of all, you and I have known each other for, I had to look back, many, many years. And we've crisscrossed in different types of jobs and things that we've done in the past around data. And we've actually had a lot of fun over the years, having lunches and having these expansive conversations about everything, right? So it's kind of funny to record one of them; basically, we're just recording our conversation. The thing I really like about you is that you really bring good energy. You have a lot of energy and a lot of passion about data and data problems. And I feel like when we talk, we talk a lot about the future, which is a topic that I'm very interested in. So we talk about what we see happening right now and what we think is going to happen in the future. And you're kind of a renaissance man, where you can talk about almost any type of topic. So it makes it so much fun for me to chat with you.


Adi Elliott  01:47

I appreciate that. Likewise, it's always super fun.


Debbie Reynolds  01:51

Yeah, well, tell the audience a little bit more about you, or anything I left out, anything that you want to tell us about Canopy's data breach response software.


Adi Elliott  02:04

So it's a really interesting space in that, one, it's something that we all know about now, because we read about it in the newspapers pretty regularly, from ransomware attacks to state actions. And, heck, it even affects our daily lives sometimes; we're getting letters in the mail about it all the time. But what's interesting about it, from my perspective, is that the quite sensible regulations were really way out in front of the technology. Because if you think back like 10 years ago, this was barely a thing, if at all. It wasn't something that people often considered. I came from the ediscovery space, and I think back to 2008, when we were really getting started in the modern world of ediscovery for me, and we were all doing these demos with the Enron data set. There was just so much PII in that data, and it never occurred to anybody: credit card numbers, social security numbers, all sorts of personal data were in there, and nobody cared. Actually, it's not that nobody cared; we didn't even think about it, it wasn't even in the ether. So data breach notification was a space that came from regulations. And at first, people were just using whatever tech they could to do it, because the regulations usually say that if people's PII, their personally identifiable information, is compromised, or if corporate data is compromised at all, you need to assess the data that was compromised, figure out if there's PII in there, and if so, let those people know so they can take reasonable action. And what people were using was technology mostly designed either for searching or for ediscovery review, because at a really, really high level, data breach response kind of looks a bit like ediscovery. You're technically processing the data that was compromised, you're technically culling it down and reducing the data set, kind of like you do in ediscovery, and then you're reviewing it, and then you're producing something. So at this really high level, it's the same, but as soon as you get into the weeds, it's super different. You're processing it really just for the PII. And then when you reduce the size of the data set, you're not reducing by keywords and dates and custodians, like in ediscovery; you really just care about what has PII in it and whose data it is. And when you're reviewing it, you're not bringing legal determinations to the review. You're not looking at documents asking, is this relevant? You're looking at a document asking, is there PII here, and if so, whose is it? You're stitching it up and making connections between the data, and then your ultimate deliverable isn't even a document. It's really just a list of people and the PII that was affected. So Canopy is really the first software in the whole world that's focused on that problem start to finish, from "we have raw data that was compromised" to "okay, here's the list of people," and then the law firms and insurance companies and corporations can decide how they want to handle it, and if and who they need to notify. That's what Canopy does. The space is interesting because it's really important, I think, in terms of doing good in the world and letting people know as fast as possible, because there are huge time constraints; they may need to make some changes to their passwords or update their credit cards, or whatever it is.
But it's also just an incredibly interesting data problem.
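To make the contrast with ediscovery concrete, here is a minimal, hypothetical Python sketch of the deliverable described above: compromised documents go in, and what comes out is not a document set but a per-person list of affected PII. The function names, entity types, and data structures are illustrative assumptions, not Canopy's actual design.

from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class PiiHit:
    person: str     # whose PII it is
    pii_type: str   # e.g. "ssn", "credit_card", "dob"
    value: str      # normalized value, used only for de-duplication

def extract_pii(document: str) -> list[PiiHit]:
    """Stand-in for the pattern/ML extraction step (assumed, not shown here)."""
    raise NotImplementedError

def build_notification_list(documents: list[str]) -> dict[str, set[str]]:
    """Reduce compromised documents to: person -> set of PII types affected.
    Unlike an ediscovery production, the same credit card number seen 17 times
    still yields a single entry for that person."""
    affected: dict[str, set[str]] = defaultdict(set)
    seen: set[PiiHit] = set()
    for doc in documents:
        for hit in extract_pii(doc):
            if hit in seen:
                continue
            seen.add(hit)
            affected[hit.person].add(hit.pii_type)
    return dict(affected)

In this sketch, the extraction step carries the weight; the review that follows is about validating hits and attributing them to people, not making legal relevance calls.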


Debbie Reynolds  05:24

Yeah, I agree with you. Not all of my audience is in ediscovery; actually, quite a few of them aren't in ediscovery at all. So at a high level, I'll make that parallel with you, which I think is true. In the ediscovery process, there is a lawsuit or some type of initiating action, right, that starts the whole clock ticking on how data gets collected and exchanged between parties, whoever they are, whether it's a company or a government or something. And here's the process: the data gets processed in some way, reviewed for legal issues, and then exchanged. So that's ediscovery at a high level. And like you said, when people are thinking about breach response, it sort of sounds like that process, right? There's an incident that happens, there's some fact-finding, and there is kind of a data exchange, even though it may not be document for document; it's more kind of statistics, right?


Adi Elliott  06:30

Yeah, it really throws people off, because it's the difference between exchanging documents relevant to litigation versus "I just need a list of people, and I need to know which of their data was breached, so we can let them know." And it turns out the devil is very much in the details; those are radically different endeavors. So I get why people were using ediscovery tech for a while; it's all they had.


Debbie Reynolds  06:56

Yeah, right. You're trying to solve a different type of problem. And ediscovery is also a very expensive process, right? This really doesn't require, in my opinion, legal knowledge to say, "I have 100 customers, and this is the PII I had about those folks."


Adi Elliott  07:16

Right, yeah. And that's another difference. In ediscovery review, you're literally using licensed attorneys who are making a legal determination on each document. Their job is to look at a spreadsheet and say yes or no, is this legally relevant to the matter at hand, the litigation, based on my knowledge of the litigation. Whereas in data breach response, it doesn't have to be a lawyer at all, and it's often not. You're really looking at the data, that same spreadsheet, and you might look at it and think, all right, there's a list of 3,000 people here, and there's a lot of credit card numbers and social security numbers, so I need to somehow capture all of that. It's not even about the document being relevant or not. So it's just super different.


Debbie Reynolds  08:07

And then also, I feel like with the way that you need to collect this data and report on it, you can really bring a lot more technology into it, right?


Adi Elliott  08:17

Yeah. For us at 30,000 feet, one way to describe Canopy is as an AI cybersecurity company, because that's ultimately what we're doing: bringing a whole bunch of AI to bear on this problem. And to your point, in ediscovery, machine learning is this extra thing that maybe happens over there. First you process the data, and then if it's a really big case, or a really fast-moving case, or if it has certain dimensions, then you might pay the extra money to use these analytics on it. Whereas in our world, the machine learning and the AI are literally part of processing, because you do that every single time as part of the process. It's not some extra thing that happens; using AI is the table stakes of the endeavor.


Debbie Reynolds  09:10

Right, absolutely. Well, one thing I want to bring up, and this is something I do in my work: I work a lot with early-stage companies and people who are doing emerging technologies and things like that. And sometimes when I'm talking to a developer, I tell them we need to have fields that capture where people are, like what country they're in, and in the US we need to know the state, so that if we ever need to report that information, we have it, right, without going through other extraneous efforts. And literally, I've had people's heads spin around once I told them that. They may not be really thinking about the data in those ways.


Adi Elliott  09:51

Oh, absolutely not in the course of business. For everyone in these outside spaces, it's so easy to forget that people using data are just trying to get a job done, largely. People just don't think about the regulations day in and day out; they're just trying to do whatever their job is. And next thing you know, they hit a phishing email, and all of a sudden it's a different endeavor. And unfortunately, we live in a world where companies can largely take all the right steps and all the right precautions, and still have compromised situations and still basically get breached. So it's not even a "the boogeyman is gonna get you" thing anymore; that's just the way the world works these days.


Debbie Reynolds  10:44

Yeah. And so one of the reasons, beyond us knowing each other and me thinking you're really awesome, that I thought it was really important to have this conversation is because when you hear about breaches in the news, it's sort of, oh my God, that company had a breach, and then it's just hair on fire, and they don't say what happens next. And what happens next is what you deal with. So let's talk through this scenario, right? We have a company, let's say a mid-sized company, 100 million dollars in revenue or something. Maybe they have operations in the US, but they have customers around the world or whatever. And they have a data breach. What happens next?


Adi Elliott  11:29

So a huge percentage of the time they have cyber breach insurance, probably 80-plus percent of the time at this point. And essentially, the insurer largely drives a lot of the process, because they tend to be the hub of everything happening. For the company, the main thrust is reputational: oh, this is bad, we don't want our customers' data out there, or our employees' data, so let's get our arms around that. And on top of that, how expensive is this going to be for us? So they contact their insurer, and the insurer usually has several folks that they say the company should talk to. Sometimes some of them are already embedded, like the digital forensics and incident response people; those are the ones who definitively say, okay, this is the data that was compromised, it was these three email boxes, or this file share over here. And then attorneys get involved; again, oftentimes the cyber insurer will loop them in, or at least say, hey, select from one of these folks. And they'll say, okay, with the scale that we're talking about here, we need to go deeper and figure out what our notification obligations are. And that's where Canopy comes in, or at least our clients, because we sell to people who then use our software on a project-by-project basis. What they need to do is assess those three mailboxes or that file share, go through it, and figure out, okay, what's the PII in here? Oftentimes, it's very much needles in haystacks every single time. And it can vary based on the nature of the business, whether it's a hospital or some sort of healthcare company, or a retail company, or just a run-of-the-mill company that sells vitamins or something; is it their client files, is it HR files? But essentially, once they have, let's just say, the mailboxes, someone will run them through our software, which usually takes about a day, and they'll say, okay, we figured it out: out of 500,000 records, here are the 20,000 that actually have PII in them. And now human beings actually need to go through and say, okay, is that a social security number? Yes, it is. Does it belong to that person over there? Yes, it does. Our software facilitates all that and makes it really fast and easy, but essentially, people need to go through and validate: yes, here's all the PII, and here's who it's related to. And then lawyers are coming in along the way and saying, okay, we're hitting notification thresholds over here, we do need to notify about x and y and z. So they're advising every step of the way. And in large part, everyone's trying to go really fast, because the regulations speak to a speed component. For instance, in the EU under GDPR, and in the UK as well, since they still follow the GDPR regulations on this post-Brexit, you only have 72 hours from the initial incident to notify the regulatory body of the general dimensions of the compromise, which is incredibly fast. It's so fast that I would argue you could not effectively do that without Canopy. I don't mean it's not a great regulation; it's good, but this is one of those cases where the regulations drove the software.
If you had 500,000 records or a million records, and you needed to substantively tell a regulatory body what the nature of a compromise was without our software, that would be incredibly difficult to do; it would probably take you a couple of weeks or whatever. In large part, you've got to go through all that data, assign it, and get it down to, essentially, a clean list of individuals, whether it be hundreds, thousands, sometimes millions, and exactly what of their PII was breached. And then at the very end of the process, you've got to let people know, and that's where we get those letters in the mail. And I bet you've read through the whole thing when you got one.


Debbie Reynolds  15:43

That's cool. So it actually sounds like you have almost a gamification component there, where you're serving up information for the person to say yes or no, or do some type of validation, on a very rapid basis.


Adi Elliott  15:59

It is, yeah. The heavy lifting on the PII is really about surfacing it. The heavy lifting on the machine learning is queuing up the PII as fast as possible, because you still want human beings to validate it. You still want human beings to actually say, yes, this really is PII; yes, it really does belong to those people. But you make that as fast as possible, and you also make sure you don't have to reinvent the wheel. If you've discovered what someone's PII is, you don't need seven other reviewers to keep rediscovering America on that.


Debbie Reynolds  16:31

Right, exactly. And then too, a lot of companies have duplicative information, which makes it even harder, right? It just exponentially increases the amount of data that they have to actually look through, because some of it is duplicative and they only have to report it once, right?


Adi Elliott  16:54

Yeah. If you think about that end goal, and we've almost all been that person: bottom line, as a human being, I just want to know what was compromised, what is potentially out there now, and what action I have to take. So it doesn't really matter if you hit my credit card number 17 times; I only really care once, just let me know. And I'll say one thing that's really interesting, and again, this is one of the things that's most interesting to me about the data breach response space: this really is a space where the regulations drove the technology. The technology very much wasn't there originally; you couldn't very effectively do, in one easy spot, that initial culling from a million documents, and as you know, you get a few mailboxes and you're at a million documents very fast, and review goes quickly only when you're looking through the right stuff. To get from a million records to, say, the 50,000 that actually have PII in them, before Canopy people had two main approaches. One, they would run search terms. And two, they would run regular expressions, which are just kind of simple pattern matches: for a phone number, it would just be the pattern of a phone number, or for a credit card number, the pattern of a credit card. And a lot of PII just doesn't match keyword searches; it doesn't politely show up in a keyword search when you're looking for it. Frankly, keywords aren't meant to find social security numbers. If you're looking for the term "social security number" or "SSN," a lot of times a social security number doesn't show up when you've put in those keywords. And conversely, if you just use the simple pattern of a social security number, you hit a ton of stuff that's not that, and then you miss stuff that, for whatever reason, didn't exactly match the pattern but still was a social security number somehow. So you end up in this weird scenario where you're over-inclusive and under-inclusive simultaneously. You're reviewing tons more data than you needed to, but it wasn't all the right data. That's kind of the worst of all worlds.
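A small, self-contained illustration of the over- and under-inclusive problem described above, using a naive SSN-shaped regular expression in Python; the sample strings are invented for the example.

import re

# Naive "SSN-shaped" pattern: three digits, two digits, four digits, dash-separated.
NAIVE_SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

samples = [
    "Applicant SSN: 123-45-6789",                 # true hit
    "Part number 555-12-0001 is back-ordered",    # false positive: matches the shape, is not an SSN
    "SSN 123 45 6789 on file",                     # false negative: real SSN, spaces instead of dashes
    "social security number attached as an image", # keyword hit, but nothing a pattern can find
]

for text in samples:
    print(bool(NAIVE_SSN.search(text)), "|", text)

The first two lines show the over-inclusion, the last two the under-inclusion; that mismatch is why pure keyword and pattern approaches force teams to review more data than necessary while still missing real PII.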


Debbie Reynolds  19:03

Right. And to your example about someone's credit card showing up 17 times in the data set: you need to notify them only once, or count them once, right? Because regardless of how much data there is, it's still that one person who needs to be notified. So to me, that illustrates the weakness of trying to do it almost like a legal review, right?


Adi Elliott  19:30

Oh, totally. Because those legal reviews think about deduplication at the document level; they don't think at the micro piece-of-data level. They're not deduping the idea that we've already reviewed this one little nugget inside of a document, this credit card number. For them, if an email is literally the same email, like if I send you an email, then if they review my version from the sent box, they don't need to review your version from the inbox. That makes sense. But the credit card number is going to be in wildly different documents: it could be in a spreadsheet in one place, in some sort of form in seven other places, in an email over here, and in a text file, depending on how it was stored. It's going to be in a lot of different places. So even the entire nature of deduplication just works super differently.
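As a rough illustration of that difference, assuming made-up email text: document-level deduplication (the ediscovery habit) treats these two messages as distinct, while element-level deduplication recognizes that they contain the same card number.

import hashlib
import re

emails = [
    "Hi, the card ending 4111 1111 1111 1111 was billed today.",
    "Fwd: card 4111-1111-1111-1111 was declined, please retry.",
]

# Document-level dedup: different text, different hashes, so both get reviewed.
doc_hashes = {hashlib.md5(e.encode()).hexdigest() for e in emails}
print(len(doc_hashes), "unique documents")              # 2

# Element-level dedup: normalize the card number and key on the value itself.
def card_numbers(text: str) -> set[str]:
    return {re.sub(r"\D", "", m) for m in re.findall(r"(?:\d[ -]?){13,16}", text)}

cards = set()
for e in emails:
    cards |= card_numbers(e)
print(len(cards), "unique card numbers to notify on")   # 1

The choice of key matters: hashing whole documents answers "have we reviewed this file," while normalizing the PII element answers "have we already found this person's card," which is the question breach notification actually asks.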


Debbie Reynolds  20:22

Yeah, it doesn't lend itself to this. And then I think trying to do it the other way makes it exponentially more expensive, too, and unnecessarily so.


Adi Elliott  20:33

Yeah. And honestly, here's a really funny one. We don't normally get to talk to the corporate clients, because almost exclusively our clients are the ones engaging with the corporations and the insurance companies. But for whatever reason, on one of these projects, we ended up hearing from the company itself, and for them, based on what the initial estimates were using the legal review workflow, it was going to cost them like $3 million more than they ended up spending with Canopy. And that's the sort of thing where I was like, man, can we put that in a case study? And he's like, nah, we don't want to talk about how we were breached. Which is fair; it's not their favorite thing. They love that we handled it quickly and that what needed to happen happened, but it's not a story you get to tell. But I mean, it's just a night and day difference.


Debbie Reynolds  21:31

So you touched on a point that is extraordinarily important and that I would love to chat with you about, and you probably have some insights you can share, hopefully. What's happening in the breach space, and I think this is a reason why cyber criminals are so successful, is that when people are breached, they don't want to share. They don't want to share their learning, the things that they learned. First of all, they don't want people to know they were breached, right? So once someone knows it, they try to keep it close to their chest as much as possible; they aren't very forthcoming about how it happened, or how a tactic that was used by a cyber criminal could help other people avoid the same things. So that benefits criminals, because then they can do the same trick on the next person. 100 percent. And the next person, right?


Adi Elliott  22:34

Yeah. And weirdly, it's one of these things where it's hard to speak out publicly about it, because it almost invites litigation. I'll go back to this: I would say most of the companies that we end up hearing about that this happens to are doing all the right things; they're taking the right steps. It's just a very difficult problem to prevent. It's not as if people are ignoring this problem and ostriching their way through it; they were trying really hard, and there were a lot of people who didn't want this to happen. But if you surface it, then it invites the idea of, well, if they're talking about it publicly, maybe there should be some sort of class action, maybe there's more money to be had here. And that's a whole separate thing. So then people don't benefit from the learning at all.


Debbie Reynolds  23:30

That's right. So as a company that does this, at a high level, obviously not about particular clients: what trends are you seeing in this after-breach space that we may not even be thinking about? I'm sure there are some patterns that come up that you may know of just because you work with companies that have these incidents, at the after-incident level. Are you seeing people with their hair on fire? Do they seem to be adjusting how they operate as a result of this? What do you see?


Adi Elliott  24:12

So here's a weird trend that it's even hard to figure out how to talk about, but I'll try: it's wild how often these projects end up costing almost exactly as much as the breach insurance policy covers. That's just a weird trend that we run into, and it's an incredible coincidence that the cost of the projects is exactly what is covered by insurance. And for us, transparently, Canopy throws a wrench into all that; one of our biggest challenges is that things get a lot cheaper, and some people aren't wild about us for that. But it is a weird trend, and here's the net result of it: breach insurance is going to get even more expensive. Breach insurance cost is going up; if you've noticed in the space, some insurance companies are getting out of it, because it is more expensive and everyone's kind of maximizing to how much will be paid. And it's a weird thing, because it's still pretty new as a space. But I don't know, that's a big trend.


Debbie Reynolds  25:27

Well, on insurance, I anticipated, as a lot of people did, that the insurance market would tighten up on cyber, because there are so many breaches, and so many people having breaches didn't do kind of the basic stuff, right? Obviously, there may be some Mission Impossible, Tom Cruise hanging from the ceiling breach stuff, but that's not really the majority.


Adi Elliott  26:00

I guess here's another big trend, and this is the reality: the vast majority really are just phishing emails, just straight up. The majority of breaches are not sophisticated, like what you read about in the New York Times, where some fancy digital bomb was planted, or USBs were left in the parking lot and picked up by somebody. It's just that somebody got a phishing email that looked like a real email, and they clicked on the wrong link, and next thing you know, they've been hacked. It's just that simple, and the phishing emails get better and better and better. But that's the majority of it, like four out of five, maybe more; it's just a huge percentage. And there's this balance, because if you think about it, people have to be able to use email to do their jobs. But email is the number one place where these intrusions are coming from, and it's those links, and you can't just not allow people to click on links in emails; nobody would be able to do business. We can't do that. So what do you do there? I don't know that there's a good solution, and obviously it's just an endless cat and mouse game. Frankly, I don't know that this is ever going to be a definitively solved problem.


Debbie Reynolds  27:21

I don't know; maybe I'm wrong, and I've talked to some of my other cyber friends about this as well. But when someone clicks on a phishing email, what happens is the person that sent the link is able to manipulate or take advantage of the permissions that the person has within the organization. So a lot of that, to me, is about the access of individuals. In my opinion, and again, I've said this to my cybersecurity friends: if, let's say, a low-level employee who doesn't have a ton of access clicks on a phishing email, I would think what the cyber criminal could do would be limited. So I almost feel like the risk scales with the level of the person's access. And this can happen a lot of ways: either the person is a high-level person in the organization, or, with their access controls, they have more access than they should have, to put it like that.


Adi Elliott  28:32

Yep. And to be fair, even if the only thing the person has access to is their own email, their own mailbox, which is most of the breaches, every single person's mailbox has a shocking amount of PII in it just from the course of business. A random marketing coordinator for some random company has way more PII in their mailbox than you'd expect, just because doing business throws it off like crazy. We're just interacting with things, and it's all reasonable on an individual basis. But as soon as you get somebody that actually has access to the entire thing in aggregate, then it becomes a bigger problem, and frankly, then it has to be reviewed. And for us, even if it's 300,000 records and you get it down to 15,000 that have PII in them, unfortunately, that 15,000 matters to the people whose PII it is. That's an important 15,000.


Debbie Reynolds  29:36

Right, exactly. If you're one of the people whose data got breached, that part of the data is pretty important to you.


Adi Elliott  29:42

Yeah. And it's cool to at least be able to help with this. I'll say this: it's something I feel really good about, waking up every day and trying to help solve it in some way. It feels good to at least be on that side of it. I think for both of us, coming from ediscovery, getting into the security and privacy space is enjoyable in that way, in that you feel really good about it, and yet a lot of the learnings from ediscovery are still useful over here.


Debbie Reynolds  30:14

Yeah, you get to do good. And I'm a very data person, so zeros and ones are my hobby; to me, the glass is like two-thirds full. So what would be your thoughts, or what would you say to a company that hasn't been breached yet? They've not experienced a breach yet. Based on the types of things you've experienced, what would you advise them, if you could go back in time and tell them something that may help them in the future?


Adi Elliott  30:55

So something that we are starting to see more of, given the nature of what we do, is companies using what we do proactively instead of reactively. We have a client that thinks about it this way: rather than doing it as a breach exercise, they do it as an informational exercise about how they're storing data. Obviously, they have the places they're supposed to put data, but the problem is that people just do their jobs and don't really think about the data. So on a monthly basis, this organization takes a sample from either a line of business or one of the departments within the company. They sample the data and look at what sort of PII is being stored. Are we aware of it? Does it comply with the way we think things are? Is it duplicative, meaning does the same PII live somewhere else that this person could access? And they just rinse and repeat on a regular basis. What they find is that at first it's a trailing indicator, just kind of a "here's what we're doing." But over time, as the line of business leaders and the C-suite understand the risk profile of the company, like, hey, if we did get phished, here are the sorts of things we could do to button ourselves up so we'd be in a better spot from a data standpoint, it becomes a leading indicator, because people realize, hey, this is duplicative, we do have the same data over there, we don't need this to sit here, we can get rid of this data. And I'll say, personally, I could never get the information governance dog to hunt, and I tried to make it work. Meaning the pitch of "hey, you have a bunch of useless data, get rid of it" just didn't land, because no one cared and no one would review it. That was always my trouble coming from the ediscovery space: hey, think of how much data is useless. And they're like, yeah, that's fine, but whose job is it to review the data? But when it's real risk-profile stuff, when it's "hey, we actually have PII that does have regulations around it," it's a very different story, and the chief risk officer and the chief privacy officer have just a different view of that.
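A hypothetical sketch of the monthly sampling exercise described above. The department structure, sample size, and the scan function are assumptions for illustration; any real version would plug in actual PII detection.

import random

def scan_for_pii(path: str) -> dict[str, int]:
    """Stand-in for a real PII scan: returns counts per PII type found in one file (assumed)."""
    raise NotImplementedError

def monthly_pii_sample(department_files: dict[str, list[str]], sample_size: int = 50) -> dict[str, dict[str, int]]:
    """Sample each department's files and tally what PII is sitting where.
    Tracked month over month, the report shifts from a trailing indicator
    ("here is what we found") toward a leading one ("this is duplicative, remove it")."""
    report: dict[str, dict[str, int]] = {}
    for dept, files in department_files.items():
        sample = random.sample(files, min(sample_size, len(files)))
        totals: dict[str, int] = {}
        for f in sample:
            for pii_type, count in scan_for_pii(f).items():
                totals[pii_type] = totals.get(pii_type, 0) + count
        report[dept] = totals
    return report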


Debbie Reynolds  33:20

Yeah, I'm very much into the proactive stuff. And I'm sure people have heard my rantings and ravings about legacy data, data that may have kind of lagging business utility, that's in a back room or somewhere in the cloud that no one cares about. But in a cybersecurity situation, that data may not be a huge asset to the company at the moment, but it may be a huge risk.


Adi Elliott  33:51

Totally. Yep, that's the thing that's gotten people to care about it, from my perspective. Just telling them it's useless didn't move the needle; telling them, hey, you don't need it, they're like, yeah, it's fine, what does it cost us per month? Whether it's in Microsoft or AWS or whatever, or just their own servers, they're like, whatever, it doesn't cost us much to host it, it doesn't cost us much to have it there. But when it's, no, literally, there's a ton of PII in this and it's really risky, that's different. And frankly, it's easy to convince people of the utility of that when they're constantly reading in the newspaper about all the other companies that have gone through this exact thing. You're literally reading every week about another company that was breached. And for every big dramatic one, there are a lot more small ones that aren't the sort of thing that moves the national dialogue, that are just run-of-the-mill companies and not-so-important stuff, but important to the people whose info was breached.


Debbie Reynolds  35:24

Yeah. And it seems like they're talking about breaches more in the news, but I think that's more because of the big companies that are being breached and these very sophisticated breaches that are happening, while there are probably less sophisticated breaches happening that people aren't hearing about, because those aren't really big companies. So I think it gives people a false sense of security, because, you know, just like the pipeline breach where the company paid the ransom and the government was able to get their ransom back: that doesn't happen.


Adi Elliott  36:01

No, you don't have the FBI and the President commenting on your breach most of the time; you definitely do not get that benefit.


Debbie Reynolds  36:07

No, no, no. No one's going to break their neck for you. That only happens in the movies, where the FBI is like, we'll come help you get your money back or something.


Adi Elliott  36:15

No. So, yeah, just as a side note, I have to observe that it's wild how sophisticated some of the groups that do these attacks are, even insofar as they have corporate websites and core values and do recruiting; you can look at their websites. Obviously, they operate in places that aren't under US jurisdiction, but they behave a lot like companies, in a weird way, from a recruiting standpoint, because they need engineers and a bunch of people to do the stuff that they do. It's wild.


Debbie Reynolds  36:53

Yeah, highly, highly sophisticated.


Adi Elliott  36:56

Uh huh.


Debbie Reynolds  36:56

I think we're naive to think it's some kid in a hoodie in his parents' basement doing stuff like this. It's like a business industry, isn't it?


Adi Elliott  37:06

Yeah, these are certainly real businesses, and you see the size: a lot of these businesses are as big as or bigger than the businesses that they're actually hacking, from a revenue standpoint. If you were to look at how much revenue these folks are generating, they're often like medium-sized businesses. These are not tiny little startups anymore.


Debbie Reynolds  37:27

Right, exactly. And another thing that people don't understand about breaches is that the people doing these breaches are getting together and combining this data. They're getting bits and pieces from different places, and they're actually coordinating together to make more fulsome profiles of people.


Adi Elliott  37:48

Yep. And I feel for the insurance companies in a lot of ways, because it really is a thing you need insurance for, but it's a new enough space that it is a tough thing to insure right now. And in a lot of ways, our software is probably as much of a friend to the insurance company as anything else, because I think if we can make this as cost-effective as possible, then ultimately the total cost of these incidents is less, and that, at least, strangely, helps them too.


Debbie Reynolds  38:20

Right, right. So if it were the world according to Adi, what would be your wish for Data Privacy, either in technology or regulation, anywhere in the world?


Adi Elliott  38:36

Hmm. I guess the obvious place, the optimistic side of me, is: I wish the hackers wouldn't hack, and I wish we could all just do without this. As much as I'm glad to be a part of helping with it, I wish it wasn't necessary. And that's a part of me that I always need to make sure I think about: thinking through the downsides of this. That's where I get to the optimism about technology. Technology is always neutral; it isn't good or bad, it just is, and then humans come along and do whatever they do with it. So I guess in a perfect world, we wouldn't need all this. But in a magic-wand world, we could neutralize this to such a degree that when someone's hacked, it's almost like spam today: okay, someone's been hacked, but we can neutralize it immediately. Make the business of it disappear; make it so it's just not worth it to be in this business.


Debbie Reynolds  39:47

Yeah, I agree with that. So, in my view, make it, like you said, not worth it, or make the value and use of the data such that it just wouldn't be worth it to do it, I guess.


Adi Elliott  40:07

Yeah. I mean, spam is not hacking by any stretch, but it's a similar-ish thing, in that there's volume, and people are making money off the fact that even though they're successful only a tiny bit of the time, it's enough to have a business around spamming people. And now it's just kind of a tax on existing: we're all going to get this spam, and enough people are going to click on it to make it pay off. I don't know; I hope we can get to a place where the cat and mouse game of this sort of stuff ends, but I'm skeptical of that. That's where my optimism runs out.


Debbie Reynolds  40:54

Oh, my goodness. Well, this has been great. I've been so happy to have you on the show, and I love what you guys are doing and the way that you're thinking about this problem. I know we're going to hear lots more about you and your company, because I feel like the end is nowhere in sight for these types of incidents right now.


Adi Elliott  41:15

Oh, we're very much at the beginning. And it's an incredibly interesting space. I would encourage anyone who is thinking about getting into this sort of space to do so, because it's something to feel good about, but it's also just intellectually incredibly stimulating.


Debbie Reynolds  41:30

Yeah. Excellent, excellent. Well, we'll definitely talk soon. This was great. Thank you so much. That's awesome.