"The Data Diva" Talks Privacy Podcast

The Data Diva E156 - Nereida Parks and Debbie Reynolds

October 31, 2023 Season 3 Episode 156
"The Data Diva" Talks Privacy Podcast
The Data Diva E156 - Nereida Parks and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva”, talks to Nereida Parks, a Data Privacy, Information, and Data Governance professional specializing in AI Ethics and Biometrics. We discuss various topics related to data privacy and data governance, emphasizing that data privacy is an operational issue and not just a legal issue, and that proper training and processes are needed to prevent internal breaches. We also discuss the challenges and risks associated with AI initiatives, including biases and heightened privacy risks, and the importance of responsibility and ethics in developing and using these technologies. We stress the need for companies to prioritize Data Privacy and invest in proper infrastructure to protect themselves and their customers. We cover the importance of good data stewardship and transparency in data collection and deletion, and the need to protect people's Data Privacy and prohibit selling or profiting from people's data. We emphasize the need for consistent policies and laws that encompass all elements of Data Privacy and ethics, and the difficulty of navigating different laws in different U.S. states. We also discuss the challenges of data retention and destruction, particularly in relation to Data Privacy, and the need for proper means to destroy data that no longer has business value. Finally, we emphasize the importance of continuous learning and change management in implementing data privacy programs and other new concepts in organizations.



SUMMARY KEYWORDS

data, organizations, privacy, ai, law, companies, ensure, technology, ethics, generative, occurs, individuals, legal, agree, biometrics, understand, information, piece, illinois, risk

SPEAKERS

Nereida Parks, Debbie Reynolds


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show today. Her name is Nereida Parks; she is a Data Privacy, Information, and Data Governance professional specializing in AI ethics and biometrics. Welcome.


Nereida Parks  00:45

Thank you. Nice to be here. Yeah.


Debbie Reynolds  00:49

Well, we have been chatting on LinkedIn; I think we had a couple of networking calls. You actually introduced me to your professor at Northwestern, who was pivotal in the BIPA law; she and I still need to catch up for sure. But you have such an interesting background; you definitely have a passion for privacy, but also for the ethical considerations and biometrics. So why don't you tell me about your journey into these areas and why they are of so much interest to you?


Nereida Parks  01:24

Absolutely. So I'm sure, as my LinkedIn profile probably shows and reflects, I've been in the healthcare industry my entire career. By degree, I'm an epidemiologist; I have an MPH, but I'm also a molecular biology microbial geneticist. And earlier on, you know, I started my career, as with a lot of private sector healthcare professionals, in pharma and Big Pharma. And over a period of time, a lot of people take an interest in other areas, and that led me to wanting to explore more of the business side. So back to business school, and after business school, I went head first at GE into operations and process improvement, getting a black belt, all the things that GE is known for. That journey with Data Privacy kind of started during that time; GE was one of the first organizations way back then to look at data transformation, healthcare transformation, digital transformation, all these transformative areas, and they took a very strategic approach to digitizing a lot of their businesses, for operational efficiency purposes and for customer experience purposes. And I was on the medical device side. And so from that time, as technology has become more savvy, as you probably are already aware, Data Privacy really is an operational situation, and only when there's a cybersecurity issue or some sort of major concern does it funnel up to become a legal issue. But where these technologies, AI training algorithms, and all these sorts of things that have come to light recently actually take place is on the operational side, the product management side, with product managers developing and trying to roll out new products and services. And so that has led me through this journey of certifications, law school, and all these different areas. And so, as we know now, these laws are very prevalent, the requirements of them are very prevalent, the ethical components are very prevalent, and that has been my journey.


Debbie Reynolds  03:44

Wow, that's amazing. Oh, my goodness, you cover all the areas. One thing that you said that I love and I agree with, and I've been telling people this forever: privacy is an operational issue. I think a lot of people, when they think about privacy, attack it from a legal lens first, which just doesn't fully capture the risk to the organization because it's more of a top-down approach. And it needs to be more bottom-up, because companies have data; companies use data. A lot of the problems that companies have, especially with fines and regulation, are almost entirely about operations. It's not about the legal part. Right?


Nereida Parks  04:27

Absolutely. 


Debbie Reynolds  04:28

It's not that companies don't understand the law.


Nereida Parks  04:30

Absolutely. And I think that is the major area or piece that creates challenges for organizations. And I think they are learning, along with the rest of the world, about these technologies. Now obviously, you know, data scientists, people like that, we already knew about generative AI and that it was coming and all these different things. Data Privacy is not a new concept at all. But it is, as you mentioned, an operational issue. When something happens, it's on the operational level, and it only funnels up when there is some sort of legal implication. Just like with torts and any other areas of law, only when it bubbles up and becomes a problem does it become an organizational issue that funnels to legal; but these things happen in the business. And if I could press upon organizations, that's where their journey with Data Privacy and establishing a program needs to start: making sure that you are categorizing your data, making sure that you have proper gatekeeping by the data stewards, making sure that people are trained. The majority of data breaches, or some sort of misplacement or violation, usually occur internally, because employees don't know — oh, well, I wasn't supposed to share that. They didn't mean to do it in most cases, but they do share it. And it's a high percentage; if I'm not mistaken from law school, 85% of these issues. And with this misplacement and sharing of personal information — certainly, we have people and countries and all these different things externally with cybersecurity, but just on a day-to-day basis, you know, there's a lack of knowledge internally about what these terms mean, how they impact individuals' jobs, and what people should be doing amongst themselves as they're operationalizing and going about their day. How do you operationalize these laws? What should we be doing? What should we not be doing? Unfortunately, and I can say this, you know, because of law school, attorneys aren't trained to do day-to-day things; they're there to manage the risks of the company. So they're not going to be available to sit in on a product management meeting or an agile meeting, you know, or to look at alpha, beta, launch, and scale tollgates and things of this nature to catch something that actually comes up. So it is really critical that organizations recognize that the best way to arm themselves is to ensure that the business side is well informed, and that they have properly trained individuals in Data Privacy — not just attorneys, but business individuals who really understand the implementation and structure and process development that needs to occur in their individual businesses.


Debbie Reynolds  07:28

I agree with that wholeheartedly. And you touched on quite a few things here. But one thing I want to highlight that a lot of companies don't understand: whenever they have Data Privacy problems, most of the time they are not malicious; they are inadvertent.


Nereida Parks  07:46

Exactly.


Debbie Reynolds  07:47

So if your focus is on the malicious side of it, you're missing the majority of the problems that people have day to day, right? So if someone posts their password on their monitor, it's not because of malicious intent. It's because, hey, I need to figure out how to remember my password.


Nereida Parks  08:08

Exactly. That's just, you know, survival, because I don't know about yourself, but I have a million passwords, and it's just getting more and more confusing to come up with unique ones. But one thing I want to add as well, when you mentioned that this is done inadvertently: a lot of time in organizations is spent on how do we prevent, and certainly that is very much an important aspect of this, but where I see companies falling short is on the remediation and resiliency plan. As people become more sophisticated at tapping into systems and cybersecurity attacks occur, it's not if it's going to happen; it's when. And it is so critical that organizations spend just as much time making sure that they have proper processes and guidance for when this occurs — to quickly ensure that customers are notified, and to ensure that their information security (InfoSec) teams and IT teams are quickly able to identify and deal with these things. And I have been in organizations where breaches have occurred, and it was really a panic attack. As organizations are learning, the focus and the resources are on prevention, and that is important, but it's also about ensuring that you have a well-balanced program where resources are allocated not only toward prevention but also toward remediation and resiliency after the fact, to ensure that everything is secure afterward.


Debbie Reynolds  09:53

Yeah, I agree with that. Because I'm concerned about AI in general. So we know AI is not a new thing, but we see a lot of companies, because of the excitement around AI, really pushing forward with it. They want to get in on the AI gold rush, right? So they're really pushing forward with a lot of AI initiatives. And my concern, I guess, is twofold. One is I feel like companies are already struggling with the data they have right now, so adding more complication and more data, I think, will just make people's jobs more challenging. But the other thing is, as it relates to privacy, AI really heightens that privacy risk for organizations. So what are your thoughts?


Nereida Parks  10:43

I wholeheartedly agree. With AI, I mean, it's a wonderful capability. Particularly in healthcare, you're able to speed up diagnosing; you're able to do so many different things that improve operational efficiency and customer satisfaction, and on the healthcare side, really speed up drug development and diagnosing — all these different things that are so important for accelerating advancements and ensuring there are proper health outcomes in a lot of different ways. But on the flip side of that, there has to be some level of responsibility. As you are aware, we see case after case, or situation after situation, where an organization is in trouble for biases, particularly in healthcare; I mean, we already have social determinants of health and underserved communities where certain diseases are more prevalent than others. And then you layer in algorithms and data, and whether it's intentional or unintentional, how they're being trained — I just read one case, and I don't want to put a particular company on blast, but these cases are out there, where the algorithm was steering doctors and nurses away from minorities and minority treatment in underserved areas. And that's the last thing that needs to happen when these particular individuals need critical care. We see it in hiring practices too; with one organization, and this was really recent, they realized that they weren't going to be able to train the algorithm not to discriminate between men and women in terms of hiring practices. And so, you know, how do you combat that? Obviously, our government is looking at different ways. And one area that's been talked about is proposing, you know, registration — allowing different technologies to be registered — because it'll provide a level of transparency about how a technology was developed, its benefits, and its risks. And that provides better insight into improving compliance and looking at ways to mitigate these risks. But I think another extra layer is providing guidance, tools, and solutions to product teams. I mean, obviously, I have been a product manager; I started my journey after business school in product management and then led business and product management teams. And that's where a lot of these technologies are actually being developed. That's where you're working on solutions, going through the alpha-beta launch scale process, and working with data scientists and engineers, depending on the type of product and solution it is. And if there isn't guidance — their attorneys, again, don't have the bandwidth or the time to always sit in these meetings, if at all. So providing guidance, tools, and solutions: product managers need to be alerted, and organizations need to look at the ethical components and ensure that they have individuals, not just in legal but in the business, to ensure that these sorts of considerations are being looked at.


Debbie Reynolds  13:54

Thank you for that. I love that perspective. Let's talk about ethics. So people talk about ethics a lot. Companies talk about ethics a lot. I'd like to see more ethical action, more action put toward ethics. But what are your thoughts about ethics, especially as it relates to AI and privacy?


Nereida Parks  14:15

I am a big proponent of ethics. You know, as we were speaking about earlier, with us networking and me introducing you to my advisor and law school professor Alexandra Franco — she was not only my advisor but also my professor for biometrics, a lot of my Data Privacy coursework, and mainly my ethics and compliance classes. And it is such a pivotal component. I would venture to say that it is just as important, and it should be coupled very closely with the development of these technologies. I mean, we're hearing so much; I think there was an interview right when generative AI and ChatGPT came out with, I guess, the founder of it; his name escapes me. But that interview circled around how these different generative AI solutions may be able to think and feel and perceive like humans, and how, if not contained and controlled, they could potentially — and these warnings are from people who actually develop these solutions. I'm not a data scientist, so I can't really speak to the development of them. But it is thought that if there aren't safeguards around this, we could potentially end up with, I guess, something like some of the movies that have come out — you know, robots and these sorts of solutions actually thinking like thinking humans. So the ethical components definitely are key. And I think that, again, it starts not from the top down or from legal — I mean, obviously, I think it's a two-pronged approach — but without trying to restrict the actual development. I mean, these tools and solutions provide tremendous value, but they're only valuable when they're not hurting or harming or creating biases against the groups that they are, as a whole, intended to help. And I think that piece is really critical to consider.


Debbie Reynolds  16:26

I agree with that. The thing that concerns me — well, the thing that has always concerned me, and that is happening now — is that I feel as though some people think AI is magic. And sometimes they will defer to AI or technology over a human. And if we abdicate our human judgment to AI, we're in big trouble.


Nereida Parks  16:54

I agree; I think that they are extremely helpful as an accompaniment. Maybe with general functions, perhaps they could potentially replace humans. I mean, we're already seeing it now in certain sectors — cabs, 18-wheelers in Texas; I believe they allow AI technology in these 18-wheelers to actually drive without people inside, or maybe there's a person there just to kind of monitor. But this seems to be where things are moving. And while I am certainly not an expert in the transportation sector, I have read plenty of cases looking at situations that have occurred, looking at data in terms of the safety profiles. And in certain areas, do I think we will be able to stop that? I don't. But in the healthcare arena, which is my domain, I don't think that AI should replace a doctor. I think that it provides significant benefits as an accompaniment. And I think the focus should be on ensuring that people understand how to utilize the technology, and utilize it responsibly, in terms of, you know, certain prompts and the ways that you can get valuable information. But in the healthcare sector, I wholeheartedly don't think that a machine should replace diagnosing at all.


Debbie Reynolds  18:30

No, I agree with that as well. What is happening in the world right now that concerns you, maybe something that has a privacy implication?


Nereida Parks  18:42

Well, there are a lot of things. I think the biggest thing that is concerning me is the lack of preparation, with organizations not really fully understanding Data Privacy, and the lack of investment being made to ensure that organizations have the proper infrastructure. I think organizations recognize, okay, we need something, but they really aren't even clear on what they need and how it needs to be disseminated. And it's a learning process, and these are not things where you kind of wait — I know there is a term that has been used in the industry called blunder funding, where you wait until there's a sanction or a breach or something, and then automatically, okay, now we will invest in Data Privacy. But it's kind of like playing Russian roulette: oh, well, you know, we haven't had a breach, or it's not a money generator, so we'll focus on other things. And that piece really does concern me, because technology is moving at the speed of light, and not having at least a baseline infrastructure puts not only organizations at risk, but also customers and regular individuals at risk of their information being compromised in a certain way. There seems to be this lag or delay, with organizations recognizing that they need something, not fully understanding what it is, and not providing the proper guardrails to protect people's privacy. That is what is the scariest to me right now.


Debbie Reynolds  20:26

I agree. I think the way that I have seen executives be taught about risk is, okay, well, risk is this thing over there, but let's not really invest money in it until something bad happens. So when something bad happens, we will spring into action and start doing things. Whereas privacy is something that has to be thought of foundationally and also done in a proactive manner.


Nereida Parks  20:57

Absolutely.


Debbie Reynolds  20:58

You don't get a privacy lens after everything bad happens.


Nereida Parks  21:02

Certainly, and that is the piece that does concern me. This blunder funding — we'll fund when a blunder occurs — is really risky. And as we've seen, with big organizations getting major slaps and punishment for it, it's really important, because, you know, when we look at privacy laws and different laws, particularly the BIPA biometric law, there are various ways in which individuals have a private right of action, and there are certain implications around that. And when you look at the private right of an individual to be able to file a complaint or sue an organization, in most cases it's not just one person; it's a class action, and you are penalized, or damages are awarded, based on every violation that occurs. And so when you look at the millions of people that can be impacted, particularly at these large organizations, that's where these large sums of money come from: every time a violation occurs, penalties ensue. So I completely agree, it's not something that you want to delay, because at that point there's so much data that organizations have, depending on the type of industry they're actually playing in. In healthcare, there are all kinds of data; insurance companies collect millions of pieces of data — from when you went to the doctor, what those labs said, every disease that you have, every doctor you've seen; any and everything that you can imagine about a person is kept. And you think about these large payers and insurance companies; they have hundreds of millions of people that they insure, so you multiply that by the bits of data, and that's how much data is out there. And that piece is really — in my title, when you mentioned information governance — it is information. Information can be personal information, any type of information that can take the form of data that is valuable to the organization, that they place value on and make money from. It's really important that those types of information — for what reasons they're being used, how valuable they are, the critical nature of the data — all these things should be categorized accordingly. And that should be the foundation; most organizations don't even know what type of data they have. They just know they have lots of data out there. And so yeah, I completely agree that it can't be an afterthought. And it's unfortunate that a company has to be punished, because once they are punished, there's an intangible hit and a tangible hit. I mean, that's your reputation and customer loyalty and customer trust that has been tainted once it does happen.
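To make that per-violation math concrete, here is a minimal sketch. The class size and violation count are hypothetical figures; the $1,000 and $5,000 amounts are BIPA's statutory liquidated damages (740 ILCS 14/20) for negligent versus intentional or reckless violations.

```python
# Hypothetical illustration of per-violation exposure in a BIPA-style class
# action. The class size and violations per person are made-up numbers;
# $1,000 and $5,000 are BIPA's statutory liquidated damages (740 ILCS 14/20)
# for negligent vs. intentional or reckless violations.

NEGLIGENT_DAMAGES = 1_000   # per negligent violation
RECKLESS_DAMAGES = 5_000    # per intentional or reckless violation

def exposure(class_members: int, violations_per_member: int,
             damages_per_violation: int) -> int:
    """Damages scale with every violation, for every class member."""
    return class_members * violations_per_member * damages_per_violation

# A hypothetical class of one million people, one negligent violation each:
print(f"${exposure(1_000_000, 1, NEGLIGENT_DAMAGES):,}")  # -> $1,000,000,000
```

One negligent violation per person across a hypothetical class of a million people already reaches the billion-dollar scale mentioned later in the conversation.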


Debbie Reynolds  24:02

I agree. I agree. Let's talk a bit about BIPA. The BIPA law. I'm from Illinois. So this law interests me a lot.


Nereida Parks  24:11

Oh, yeah.


Debbie Reynolds  24:13

So the BIPA, the Biometric Information Privacy Act, is the most stringent biometric law in the world. It has extracted over a billion dollars in fines and settlements from companies. It's probably the most hated privacy law in the US.


Nereida Parks  24:32

Yes, it is.


Debbie Reynolds  24:35

And it's four pages long.


Nereida Parks  24:38

Exactly. It's very short and concise. And it's interesting, because Professor Alexandra Franco actually was one of the people who drafted this law back in 2008, and that level of conciseness is how she has managed classes and coursework in law at Northwestern. But, you know, this law is very pivotal; I think it has set the stage for other states to follow suit. And I think, as we begin to look at, for instance, CLEAR and organizations using what they call biometric identifiers — which are things like your retina or iris scans, fingerprints, voiceprints, scans of your hand or face geometry, anything that can't be altered or changed for a particular person — there are certain requirements that organizations must follow with these identifiers. And I think I kind of touched on a couple of key things with it, but the major piece is that this particular law allows anyone in the state of Illinois the private right of action. What that means is a private plaintiff can bring an action based on a public statute, in this case the BIPA, to file a claim against an organization. But the second piece that I think is most important to bring forth and highlight is that it's very clear about defining a private entity. The private entity piece is an individual; it could be a partnership, a corporation, an LLC, an association, or any type of group. But it does not include a State or local governmental agency, or any court of Illinois, including the judge or clerk. That is important, because when you look at some of the biggest state and local government collectors of biometric data — because they have to collect it — it's our police departments, and there have typically been a lot of controversies shown there. But they're excluded, because that is the way that they have to identify, mark, and register people who have been arrested and are part of the legal system. So they are excluded from this law.


Debbie Reynolds  26:58

Yeah, I love this law. I have friends who are actively trying to repeal this law, and I'm like, don't hold your breath; it's not going to happen. I think the thing that trips people up about this law is that it's so different from other laws we see in the US, where the US is very consumer-focused.


Nereida Parks  27:20

Exactly.


Debbie Reynolds  27:20

A lot of our regulations are consumer-focused, and this is a very human-focused law. So not every human is a consumer, but every human in Illinois is covered under this law; I don't have to be a consumer of your product to actually make a claim if I feel that my biometrics were used improperly by you. So I think, because a lot of legal folks have not been accustomed to laws being like this — this is almost very European, right? Like this human rights element to it. I think that's where people really trip up. But then at a fundamental level — and you know, I speak all over the world with different companies — I literally say the data that someone gives you about themselves doesn't belong to you; it belongs to the person. And I have heard audible gasps in the audience, right? I think that's what BIPA was trying to say: the data of the person, their biometrics, belong to them, and you have to handle it a certain way, which is very simple. Like, tell them why you're going to collect the data, tell them how long you're going to keep the data, and then delete it within a reasonable time. That is so simple.


Nereida Parks  28:36

Exactly, that three-year period, exactly. You're either done with the purpose for which you collected it, or it's three years, whichever comes first. And you're absolutely correct. I really think that when you look at the requirements and what companies must do to comply, what you will eventually see, because of all the Data Privacy elements, is this overlap of really good policy and good stewardship over data. I mean, it should be made public; you know, there should be a retention schedule; people should understand what you plan to do with the data, when it's going to be destroyed, and what purpose you're going to use it for. That's really, really critical. I think it's just good data stewardship in general. But more importantly, you have so many different elements now, with people selling people's data. And I think this law really does a great job of outlining that too: you can't sell, you can't lease, you can't trade, you can't otherwise profit from people's data. People have to decide, and you have to make that public and give people a reason and a right to decide, with consent, if it is okay to do that. And we see that so much in general with your cell phone: someone is collecting your data, and then they sell it for marketing purposes. And I think, on the surface of it, you know, I'm sure in a marketing meeting, someone was like, oh, wow, this is a cool idea, to be able to collect this data and sell it. But people have rights, and things can be done with that data that someone may not want to occur. And in a lot of cases, until recently, you did not have this opt-in and opt-out; you didn't have that. And so now you're beginning to see some of those safeguards being put in place. So I just think, in general, when you read the law itself, following it in a Data Privacy practice is just really good, because it's just good data stewardship.
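That timing rule can be sketched in a few lines. This is a minimal sketch, assuming BIPA Section 15(a)'s formulation: destroy when the initial purpose for collection has been satisfied, or within three years of the individual's last interaction with the private entity, whichever occurs first (note the statute measures the three years from the last interaction, not from collection). The dates are hypothetical.

```python
# Minimal sketch of BIPA's destruction timing (740 ILCS 14/15(a)): destroy
# when the initial purpose has been satisfied, or within three years of the
# individual's last interaction with the private entity, whichever occurs
# first. The dates below are hypothetical.
from datetime import date

def destruction_deadline(purpose_satisfied: date, last_interaction: date) -> date:
    """Return whichever comes first: purpose satisfied, or last interaction + 3 years."""
    three_year_limit = last_interaction.replace(year=last_interaction.year + 3)
    return min(purpose_satisfied, three_year_limit)

# Hypothetical record: purpose wrapped up mid-2025; last interaction in early 2023.
print(destruction_deadline(date(2025, 6, 1), date(2023, 3, 15)))  # -> 2025-06-01
```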


Debbie Reynolds  30:42

I love your point about data stewardship. And you're right. The way that companies have traditionally handled data, once they captured it, they did whatever with it. They didn't really have an obligation to be transparent about how they were using data, and they didn't really have an obligation to delete stuff. So I think data retention, or data deletion, is very important. Not only does holding on to data raise privacy risks, but it also raises the cyber risks that companies have. What are your thoughts?


Nereida Parks  31:14

Absolutely. And when we look at just connecting the dots of our conversation with ethics and AI, and some of the things that the government is talking about: the major point of the whole registration process that's being proposed is that there is a lack of transparency. I mean, there's a lack of transparency with everything. It's not just how these AI algorithms are developed and their benefits and risks. There's a lack of transparency with data collection: what are you going to use this data for, how is it going to be collected, how is it going to be destroyed, how is it going to be used, and who's going to have access? All of these things overlap, and that same language of transparency is being utilized. And I think that these are elements that, if an organization really wants to create a solid program, from the bottom up, not the top down, it should align consistently so that you don't have gaps. Because what I feel is, as the laws catch up — you're seeing a lot of states developing laws, statutes, and acts, and so on and so forth, and at some point the federal government is going to look into this as well — I just think that companies would really do themselves justice to couple these two together, not having one without the other. From an efficiency perspective, from a customer loyalty and reputation perspective, and from an investment perspective, in ensuring that they have a solid foundation, it's probably to their benefit to ensure that these two things overlap: AI ethics and responsible development, in conjunction with looking at and examining all of these different laws. Because for the longest time — I mean, you're talking about 2008; we weren't talking about all of these data issues back in 2008 like we are now — Illinois was at the forefront of ensuring that a law was there, and it had been the lone one for a while. But now you're starting to see numerous states, and there are some in the queue. Beyond Illinois, Texas has one, New York, Vermont, California, Washington State, Colorado, Connecticut, Utah, Virginia, and it's going to continue to spread across the nation. And one thing I really would hope — there are so many different things in these laws and how they are constructed, but it's going to become more challenging for organizations if they don't have these guardrails in place to really understand what's going on in these different states. Not all of them read the same; like you said, BIPA is the most stringent one of the group, but all of them are different. From my understanding, not all of them have this private right of action; in some, if there is an issue or any sort of complaint raised, it has to go through the State's Attorney General. So at the end of the day, there are a lot of nuances and inconsistencies. They're not consistent laws across the board, and organizations, if they're going to do business in these different states, really need to think about how this is constructed and ensure that it's disseminated throughout the organization.


Debbie Reynolds  34:42

I agree. So if it were the world, according to you, Nereida, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or technology, or even AI ethics?


Nereida Parks  35:01

What I would like to see in the world, if I could have a magic wand: it would be wonderful if we had one consistent policy that encompassed all of those elements, so that everyone could be compliant and not be stuck with, well, what does this law say in this state? What does this country say? If there were some sort of regulating body that could pass thoughtful and considerate policies, laws, and legislation that would apply to every place on the face of the earth, it would encompass Data Privacy and the ethical components of it, because ethics doesn't just belong to one sort of state or country; ethics should be universal in terms of ensuring that things are done correctly. So if I had one wish, it would be to simplify and make all of these elements part of one global law that was mandated across the world. But unfortunately, that's not how it works. And so hence the complication, and hence why companies hire people like ourselves.


Debbie Reynolds  36:13

That's true. That's true. It's definitely complicated. And it's getting even more complicated in the US: in 2024, at least 12 new laws are going to go into effect, and enforcement of some existing laws will also begin in 2024. So I think the US will be a very interesting place next year.


Nereida Parks  36:36

Let me ask you, because I know that with your consulting practice, you work all over. I mean, I've also had global roles myself, but more confined to an organization, whereas you're on the consulting end of it. What do you see as the biggest challenges globally, and even nationally, working in different industries and sectors?


Debbie Reynolds  36:59

I think the biggest challenges globally in different sectors — I think it has a lot to do with new innovation and technology. So companies are excited about these innovations. They want to move into those new technologies; they want to move into new jurisdictions. But they are cautious, right, and they want to make sure that they're doing the right thing. And it's difficult to know what the right thing to do is, especially as we have so much of a patchwork. So let's say a company is using some type of emerging technology, whether they're building it or implementing it. Can I use it in Texas? Can I use it in Illinois? What can or can't I do? So, yeah.


Nereida Parks  37:40

Can I do?


Debbie Reynolds  37:41

Absolutely. And then, on the development side, I think the challenge is you want to build something that you can sell. So you don't want to create a feature within a tool or product that makes your product unsellable in certain places. So it's just challenging in that way.


Nereida Parks  38:00

Yeah, I think with AI, one of the things outside of the Data Privacy space that I believe legislation is going to move toward, in terms of examining the best way to manage generative AI — and I think the legal precedents are going to move it in that direction — is this intersection of intellectual property and AI. Because what you're seeing with cases is copyright infringement — you type in something — and questions of authorship and data ownership. And looking at where that's going, there's this intersection between generative AI, copyright law, and the fair use doctrine. And there are really these gaps, copyright gaps: copyrights typically cover expressions, not ideas, and generative AI is challenging this, if you will. And so I think that, at least in the United States, generative AI is going to force the way the law is looked at in certain situations. And I think the newness of what all of this technology is bringing forth is going to force that. And I think, outside of Data Privacy, intellectual property and copyright are going to be next.


Debbie Reynolds  39:19

That's a very wise prediction. I'll make another prediction.


Nereida Parks  39:23

Okay.


Debbie Reynolds  39:26

On that. So, because of all this — because you're going to have AI and different uses of AI — data lineage is going to become that much more important. Where before we were basically saying, this is how you should use data, this is how you should protect it, in the future we'll be asking, where did it come from? We're not asking that question right now.


Nereida Parks  39:49

I can't agree with you more. I mean, that is absolutely the case. Because I think the idea is, the more data you have, the richness that comes from that — it's kind of like, okay, we're really arriving. But unfortunately, the more data you have, as we've already discussed, the more it causes a tremendous amount of Data Privacy issues. And I think one thing we haven't touched on is the whole matter of destroying the data. How long should organizations keep it? I mean, obviously, the two main functional areas are HR and legal: legal has to retain certain information for a period of time, and HR always has current employees and past employees — how long should we retain their records? Those two departments, in general, have a lot of data. And what do those policies and that retention schedule need to look like to get rid of it, and what are the proper means to destroy it? We always hear, well, once it's out there, it's never really, quote-unquote, fully gone. I'm not at all a data scientist or IT programmer; I know enough to, obviously, do operations. But that is, I think, a question as well: not just how do you know the source of the data, but how are we getting rid of some of this data that the company actually has? And when is the value actually gone? How do you actually determine when we will no longer have any value for this data anymore?
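One way to make "when is the value actually gone?" answerable is to write the retention schedule down as data, with a destruction trigger per record category. A minimal sketch follows; the categories and retention periods are hypothetical placeholders, not legal guidance.

```python
# Sketch of a retention schedule expressed as data, so "when can we let go of
# this?" has a concrete, reviewable answer. The record categories and
# retention periods are hypothetical placeholders, not legal guidance.
from datetime import date, timedelta

RETENTION_SCHEDULE = {
    "hr_employee_record": timedelta(days=7 * 365),  # e.g., 7 years after separation
    "legal_hold_document": None,                    # retain until the hold is released
    "marketing_contact": timedelta(days=2 * 365),   # e.g., 2 years after last contact
}

def is_due_for_destruction(category: str, trigger_date: date, today: date) -> bool:
    """True once the retention period, measured from the trigger event, has run."""
    period = RETENTION_SCHEDULE[category]
    if period is None:  # indefinite retention, e.g., an active legal hold
        return False
    return today >= trigger_date + period

# Hypothetical marketing contact last touched in May 2021, checked in January 2024:
print(is_due_for_destruction("marketing_contact", date(2021, 5, 1), date(2024, 1, 1)))  # -> True
```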


Debbie Reynolds  41:31

I love that. I think companies are very good at, and very focused on, the beginning stages of data collection. They're not as good on the retention or data destruction part.


Nereida Parks  41:42

Right.


Debbie Reynolds  41:43

And that is where some of their biggest risks lie. I especially tell companies: data that has a low business value has a high privacy risk.


Nereida Parks  41:53

Absolutely.


Debbie Reynolds  41:54

Right, because you shouldn't have it.


Nereida Parks  41:56

Absolutely. Because I think some of the more clear-cut areas — being a trained epidemiologist, I'll use clinical trials — usually, in those instances, have a clear pathway. I mean, obviously, there's Data Privacy with enrollments, Data Privacy with the clinical trial sites, and this whole question of how are you going to collect it and how are you going to retain it. So in those specific instances, that part is clear in healthcare — for clinical trials, for instance. But when you begin to look at further exploration of a drug and future indications, when can you actually let go of some of that data? It's just fascinating. I'm glad I'm in the area I am. I tell people that to be in a tech field in general, but more specifically in Data Privacy, at this time when it's just exploding, you have to be intellectually curious. And with that curiosity, I'm just really glad that I'm in the space to continue to learn and continue to grow and be a lifelong learner.


Debbie Reynolds  43:02

I agree completely. It's an exciting time. I remember when I set up a Google alert for Data Privacy — you know, I did this 10 or 15 years ago — and there would be no articles, nothing, for years and years. And then...


Nereida Parks  43:23

And now you probably get 1,000 hits a day. Yeah. I mean, it's interesting that you mention that, because recently going to law school and going through this process, right at this time when all the information is fresh and new, is so exciting. I literally block out a couple of hours a day, at the end of the day or before my day starts, to go through and read, because there's going to be some new legislation, some development with the passing of something in a state, or some new legal case or something going on. It's amazing. I actually have alerts, because, obviously, in law school, you have your Westlaw and Lexis and all these different legal sites that you have to use for legal research just for class. But more importantly, you can set up alerts, and they actually have alerts for Data Privacy. I mean, literally, there could be 10 different things posted from these alerts a day, if not more, and then I have additional ones too. It's just amazing how things have evolved. And I think it's going to be continuous growth, because, as we know, this is new. This is transformative technology. There are new ways it's going to be applied, and it's going to challenge how we think about things. But more importantly, in organizations, it's change management; the battle of implementing this is change management. It's changing the way organizations do business. And I think you probably know, being a consultant — and I definitely know, working in organizations — that people get very comfortable with the way they want to do things. Well, we've always done it this way. But this is forcing the changes, and the whole psychology part of change management. And you can't really implement a Data Privacy program, or ethics and AI, or any of these new concepts without understanding those change management pieces, because, like anything else that you roll out, it won't be successful otherwise — it's changing, first, how people think about business and how they are going to have to approach business. And people, as we all know, most people don't like change.


Debbie Reynolds  45:51

That's true. You're absolutely right. You're absolutely right. So thank you so much for being on the show. This is amazing. I love the chats we have, and I learn so much talking with you as well.


Nereida Parks  46:04

I learned just as much. Thank you so much for inviting me. As always, I listen to all of your podcasts and look forward to some of these predictions. I'm going to keep track of your predictions and mine, and we'll have to do an update — we'll have to see whose prediction makes it to the forefront first.


Debbie Reynolds  46:24

Absolutely. I totally agree with that. That'd be so cool. Well, thank you so much again, and we'll talk soon.


Nereida Parks  46:30

Absolutely. Have a great rest of your day.


Debbie Reynolds  46:33

Okay, all right, bye-bye.


Nereida Parks  46:34

Bye bye.