"The Data Diva" Talks Privacy Podcast

The Data Diva E181 - Dan Caprio and Debbie Reynolds

April 23, 2024 Season 4 Episode 181
"The Data Diva" Talks Privacy Podcast
The Data Diva E181 - Dan Caprio and Debbie Reynolds
Show Notes Transcript


Debbie Reynolds, “The Data Diva,” talks to Dan Caprio, Co-Founder and Chairman of The Providence Group and Vice Chair of the Internet of Things Advisory Board, U.S. Department of Commerce. We discuss and navigate the intricacies of privacy in the digital age, particularly focusing on the Internet of Things (IoT) and Artificial Intelligence (AI). The discussion emphasizes the urgent need for a national strategy for IoT and comprehensive federal privacy laws in the United States, as well as the significance of the Cyber Trust Mark initiative led by the Federal Communications Commission (FCC) to signify the security and privacy standards of IoT devices and enhance consumer trust. A pivotal point of the conversation was the importance of implementing a universal opt-out mechanism for IoT devices, offering users more control over their personal data. The episode also explored the dual-edged sword of AI in privacy, highlighting the Biden administration's executive order on AI as a step towards recognizing privacy's critical role in AI governance. Dan Caprio urged privacy professionals to adopt a strategic and proactive stance towards AI governance, emphasizing the dynamic career opportunities in the field and the necessity for ongoing education and adaptation in the face of evolving privacy challenges. This episode offered a rich overview of the current privacy landscape, stressing strategic foresight and proactive governance of emerging technologies as key to navigating the complexities of the digital age while safeguarding consumer privacy, and closed with Dan's hope for Data Privacy in the future.

Many thanks to “The Data Diva” Talks Privacy Podcast “Privacy Champion” MineOS, for sponsoring this episode and supporting the podcast.

With constantly evolving regulatory frameworks and AI systems set to introduce monumental complications, data governance has become an even more difficult challenge. That’s why you need MineOS. The platform helps you control and manage your enterprise data by providing a continuous Single Source of Data Truth. Get yours today with a free personalized demo of MineOS, the industry’s top no-code privacy & data ops solution.

To find out more about MineOS visit their website at https://www.mineos.ai/


38:35

SUMMARY KEYWORDS

privacy, work, ai, internet, years, advisory board, ftc, european commission, cyber, terms, data, congress, policy, report, board, risk, companies, governance, shiny new object, act

SPEAKERS

Debbie Reynolds, Dan Caprio


Debbie Reynolds

Many thanks to “The Data Diva” Talks Privacy Podcast Privacy Champion, MineOS, for sponsoring this episode and for supporting our podcast. Data governance has become an even more difficult challenge with constantly evolving regulatory frameworks and AI systems set to introduce monumental complications. That's why I think organizations need MineOS. This platform helps organizations control and manage their enterprise data by providing a continuous, single source of truth. Start today with a free personalized demo of MineOS, the industry's top no-code privacy and data ops solution. For more information about MineOS, visit their website at https://www.mineos.ai. Enjoy the show.


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds. This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show today, Dan Caprio, who is the Co-Founder and Chairman of The Providence Group and also the Vice Chair of the Internet of Things Advisory Board of the US Department of Commerce. Welcome.


Dan Caprio  00:43

Thanks, Debbie. Thanks for having me.


Debbie Reynolds  00:46

Well, I'm very excited to have you on the show. I've gotten to know you over the time that I've been on this Internet of Things Advisory Board, and you just have such a wealth of knowledge and experience. I would love for you to be able to talk about your career trajectory and privacy; you know where all the bodies are buried, all the stuff about what's happening in the US on privacy. So yeah, introduce yourself.


Dan Caprio  01:15

Thanks, Debbie. My background is in Washington, DC. I started on Capitol Hill and worked in the House and in the Senate, and then I was at the Commerce Department. I spent a few years working for the State of Illinois in State government. Then, at the mid-career arc, I was working for KPMG in government affairs and was really enjoying all of that. But like a lot of people in privacy, my fork in the road came, this is more than 25 years ago, when my friend, Orson Swindle, who had worked at the Commerce Department, was appointed to the Federal Trade Commission. So Orson asked me to join him, and that was obviously an easy decision. In terms of our careers, we all face that fork in the road. For me, it was leaving lobbying and government affairs per se behind and taking more of a regulatory turn, because I realized I really enjoy living in DC, public policy, and connecting people, but that for the long term, this was 1997, privacy and technology policy was going to be really important, and that the best place to be able to learn the craft, if you will, was at the Federal Trade Commission. I was blessed to be with Orson for seven years at the FTC. When his term was expiring, I went to the Commerce Department with Secretary Evans, where I was the Deputy Assistant Secretary for Technology Policy and the Chief Privacy Officer, which was the best job ever because, and this is in 2005, I got to think about how to promote innovation and protect privacy. I stayed with the Department of Commerce for a couple of years, left, went out on my own, and started a consulting practice. I spent a few years in law firm life at McKenna Long & Aldridge. I really enjoyed that. I'm not a lawyer, but I really enjoyed my time at McKenna. Then the time came, the second fork in the road. I was enjoying what I was doing in law firm life, but I wanted to do it a little differently. So we founded the Providence Group back in 2015 to really focus on privacy and cyber, or what we now refer to as data risk. We do that for boards and CEOs and advise on how to assess and govern data risk and future risk. Just to touch on the Internet of Things Advisory Board, the Internet of Things has been a labor of love for me for over 20 years. It really started when I was at the Department of Commerce; my transatlantic focus was the European Commission, and at the time that was the Internet of Things. Fast forward to where we are today: I was honored to be asked to be Vice Chair of the Internet of Things Advisory Board, and it's been a pleasure to work on that, and it's been my pleasure to get to know you and to see the contribution that you've made to the Internet of Things Advisory Board. It hasn't always been easy, but you've brought a lot of ideas, a lot of enthusiasm, a lot of energy, and you've also been a very good and trusted colleague because you've sought consensus. That's the way that we make policy. Nobody gets all of what they want. But I'm proud to call you a friend, and I've really enjoyed getting to know you. I look forward to working with you in the future.


Debbie Reynolds  05:14

Aw, he's going to make me cry on my podcast. Thanks, thank you so much.


Dan Caprio  05:21

That was not my intent.


Debbie Reynolds  05:24

Well, you've been a tremendous support, and I love your background, I love your story. All the things that you're doing, you bring it all to bear on this work that we're doing together. I'll just say, for the audience to understand, the Internet of Things Advisory Board was convened by the US Department of Commerce. It consists of 16 people in the US who were selected to be on this board. We're all from different parts, different agencies, different walks of life. We are here to create a report that will be public. The report, once it comes out, will also go to the Federal working group that focuses on the Internet of Things, and they will create a report of their own; when our report comes out, I'll be sure to share it with everyone. Even though the report is not focused solely on privacy, we have a lot on privacy in this report, because, as you know, and I think as you were saying about your company, even when you're talking about data, you have to bring privacy into that equation. So being able to have those discussions is very important. One thing I want your thoughts about, or I want you to be able to talk about, is something that I learned from being on the board and talking with you: understanding the lay of the land. When people from the outside see us and they think about privacy, they try to compare us with Europe and things like that. We're a different country, we have a different history, and things are hard to do, especially in privacy. So I think sometimes people from an outside view think, hey, why don't we just do the GDPR in the US? What I found is that there are a lot of crevices that have to be filled in, and there are a lot of things that have to be addressed, before we can do a huge, big privacy thing on the Federal level. What are your thoughts?


Dan Caprio  07:54

Well, that's a great question. Reflecting just for a minute on Europe and my career: fortunately, I've been in privacy long enough that I had the good fortune of meeting and actually working with some of the original pioneers, like Alan Westin. What's forgotten, really, that's worth noting about Europe and GDPR, is that the United States and Europe, in the late 60s and early 70s, both started in the same place with privacy and continue to agree. We could have a whole episode on this. But in 1974, we had the passage of the Privacy Act, which is still in force and has a lot of meaning to those in the Federal government. It really does afford a lot of privacy protections to the public. But that act, as it was about to be passed at the time, covered public and private; it was going to be the whole of privacy. Well, the commercial sector managed to exempt itself out. So we have the Privacy Act of 1974. We continued on the road with the Europeans toward the OECD privacy guidelines in 1980, which still serve us well. But it was at that point, in the implementation of the OECD privacy guidelines, and this is pre-European Commission, that our paths diverged; 15 years later, that turned into the 1995 Data Directive. That was really the point of departure. But we started in the same place; we share a lot of common values. The European Commission has evolved a lot since the 1995 Data Directive in the direction of GDPR, and now, of course, in the direction of the EU AI Act, the Digital Services Act, the Digital Markets Act, and lots of other regulations. But in the US, we've taken, for better or worse, a sectoral approach to privacy, and we've said sensitive areas of information need to be protected. So that's medical, financial, and children. I look back on my career at the FTC; hindsight is always 20/20, and you do the best you can with the information that you have. When I started at the FTC, this was pre-97, pre-Google, pre-Facebook. If we could have known then what we know now: there were lots of really strong bipartisan efforts to pass a comprehensive Privacy Bill in the early 2000s, but the decision was made that it was too early, that we needed to give industry a chance to self-regulate. That hasn't worked. The FTC has the FTC Act, which covers unfair or deceptive acts and practices in commerce. So the FTC has to enforce the terms of privacy against that broad statute. If we could have a mulligan, a do-over to turn the clock back 20 years, knowing what we know now, there would have been a privacy bill. So much of the discussion, so much of what we spent the last 20 years working on with the Europeans in terms of cross-border data transfers, Safe Harbor, Privacy Shield, and the challenges in Schrems One and Schrems Two, we have spent so much time just bending over backwards to be adequate, which is a European term, and we'll see; there's going to be a Schrems Three. I think the US government, and the European Commission too, really is to be commended for the whole-of-government approach to the fix for Schrems Two, and the fact is that Schrems is not really about privacy; it's more about national security. We've fixed all that. The involvement of the intel community post-Snowden and the cleanup of that mess has really led us to where we are now. So I think what we have is a very comprehensive tool. We'll see if it works, and I hope it withstands the European Court of Justice.
But it's quite remarkable, and hard for those that are not in government to appreciate, the amount of time and energy that's gone into privacy and all of these adequacy decisions, really for the last 20 years, but especially the last five years, since we brought national security into it and the fix post-Schrems and post-Snowden.


Debbie Reynolds  12:43

Yeah, that's remarkable work; it's really hard. I've been on the sidelines watching that develop over the years. I've been working with the Data Directive; I've been working with data even before that came out. So I think we kind of aligned on that, but I was more in the private sector. For people who don't understand, I feel like people just talk about this a lot: why doesn't the US have a Federal privacy law? I want you to tell me why it is so difficult, because people don't understand the difficulty of doing this in the US right now.


Dan Caprio  13:19

It's difficult for a couple of reasons. One is jurisdictional. Think of regulated industry and unregulated industry, regulated industry being telcos and ISPs. This goes back to all the regulation during the Depression; the Securities and Exchange Commission and the Federal Communications Commission came out of a lot of acts in 1934 and 1935. So for regulated industry, being the telcos and ISPs, the need for regulation has been understood for a long, long, long time. What we found with the growth of the Internet, or when I started with the FTC, was that most of the players on the Internet were unregulated. They hadn't lived in a regulated world, and they didn't understand the regulated world. So a lot of the tech companies were asking at the time for self-regulation, and we did have an approach. We said, as I said earlier, medical, financial, and children, and we had the FTC trying to enforce the FTC Act. It was really simple at the time. It was a simpler time. But we would say that if you make promises, you have to keep them. A lot of arguments about deception. When Tim Muris arrived at the FTC, the conversation began to change a little bit more in terms of how you would enforce unfairness. Tim's view of privacy was that we needed to devote a lot more resources to it, and the FTC needed to bring a lot more enforcement actions, which it did, including the first unfairness case, for which Tim and Howard Beales laid the groundwork but which was actually brought, I believe, when Deborah Majoras was Chair. We had a system that kind of worked at the time. Obviously, with the velocity of change, we haven't been able to keep up in Washington; there are a lot of different interests around privacy, a lot of different industries. The main committee of jurisdiction for privacy has been the Commerce Committee, in the House and in the Senate, but that doesn't include healthcare or financial services. So I think that's been a challenge. But you've also had the challenge between regulated industry and unregulated industry, which is still mostly the tech companies. The other thing I've learned from being in Washington for so long is that passing major legislation, comprehensive legislation like this, is hard. It usually takes two or three Congresses, and the fact that it hasn't moved this Congress is a little surprising, because there had been major movement in the last Congress, and now Cathy McMorris Rodgers is retiring. So it's a jump ball to see what happens in the next Congress. The other thing that's interesting, and this is less answering your question and more looking forward, is the focus that Congress has paid to AI, rightly so. But AI is not new. You've been involved in machine learning. What's new about it is Large Language Models and the speed at which all of that is developing. But if you look at AI, it exacerbates the problems that we have in privacy. It really argues, I think, for a fresh approach to privacy regulation, and we can talk about that a little bit more. The easiest way to say it is this: I remember working on what became the 1996 Telecom Act, which was three or four Congresses. It was eight years' worth of work to get to the 1996 Act, and you're talking about something that's at least that complicated. With different constituencies and lobbying interests, the stars just haven't really aligned; it's easier to stop things from happening than it is sometimes for things to actually happen.
I think we need a fresh look at privacy and at how we regulate the Internet of Things and AI. Artificial Intelligence, I think, gives us that opportunity to look at legislation more through the prism of data risk. There's been a lot of effort over the last 20 years to look at collection, individual use, and harm, and those things are all important. But I think what we're seeing now is that this all rolls up to something much, much bigger, which is data risk. You never start with a clean piece of paper; that's what I've learned about the legislative process. But I think it's going to require new leadership and fresh thinking from members, from academia, from civil society, and from industry. And you mentioned the European Commission in the beginning; the United States is the last country in the world to not have a privacy law. It's really an impediment in international fora. I think AI is forcing that discussion, and we can talk about this: in the IoT Advisory Board, the report is going to advocate very strongly for a comprehensive Federal privacy law. I think we're long overdue, but we're going to have to go back and rethink some old assumptions. I think AI gives us a perfect opportunity to do that.


Debbie Reynolds  18:57

Right? I agree. I love it when we talk in a shorthand, because we know all the impediments, all the arguments, all the constraints that we have there. Since you brought up the Internet of Things Advisory Board, I'll just say the typical disclaimer that our words are not those of the board, and if people are interested in that, they can go to the Department of Commerce website to see the meeting notes, all of our meetings are public, all of our discussions and consensus things are public. But I would love to chat about the parts of the report that are privacy-related, that you think are very key, anything that will get people's interest in the work that we've done.


Dan Caprio  19:47

Well, I think you're right; we speak only for ourselves, not on behalf of the board, and our deliberations are public. We have been working hard for over a year; we have a meeting next week, which I think is our 11th two-day meeting in just over 12 months, an insane amount of work. We have posted some draft comments, so you can see the contours of where this is headed; we've got some things still to iron out, though I don't think those areas are necessarily in privacy and data risk. It is fair to say that the Internet of Things Federal Advisory Committee was stood up given the Congressional interest in the Internet of Things. What we need, and I think what's come out of the advisory board, is something of a reset on the Internet of Things. The Internet of Things has been the shiny new object for the last 20 years, and like a lot of shiny new objects, there's been a lot of hype, the Gartner hype curve, the trough of despair, interoperability issues, and lots and lots of things. But now we've gotten to the point where we're actually into implementation. The data from the Internet of Things really is the data that feeds Artificial Intelligence. So the reset is that we need, in the US, a national strategy on the Internet of Things. What does that mean? That means there needs to be more focus on it in Congress and in the executive branch at the Office of Science and Technology Policy, in the same way we focused on nanotechnology, on quantum, on the CHIPS Act. There needs to be that similar level of focus on the Internet of Things, because there are all these inexpensive devices that are collecting a lot of data, that impact people's privacy, that are important for the network, and that have a lot of cybersecurity implications. So that's number one: articulating that we need a national strategy. The US Office of Science and Technology Policy publishes a critical and emerging technology list, and I think, in spite of all the hype of the last 20 years, the Internet of Things is still an emerging technology and has lots of implications that need to be addressed. The second part, which we talked about, is that we need a comprehensive Federal privacy law that includes privacy and cyber for IoT devices. The third thing, which is actually probably the most significant development in the Internet of Things and has been in the works for probably 20 years, is the creation of the Cyber Trust Mark. It's been led by the White House and all the executive branch agencies, and now the FCC is the program owner and will implement rules around the Cyber Trust Mark. It's a voluntary public-private partnership so that when consumers buy something off the shelf, the Cyber Trust Mark says, here's the security, here's what goes into this product. The FCC has it, they've issued a notice of proposed rulemaking, and the hope is it's up and running by the end of the year. That creates, I think, enormous opportunities for international collaboration with the European Commission; there's been a lot of work done on a similar trust mark in Australia and Singapore, and the Internet Governance Forum has paid a lot of attention to it. So the idea is to work toward conformity assessment so that the label in Europe and the label in the US don't have to be harmonized, but work towards the same end. I'm very optimistic about the Cyber Trust Mark.
And I think the Internet of Things Advisory Board will give a very strong thumbs up to continuing the public-private partnership and pressing on in terms of international cooperation and conformity assessment. That's the biggest deal to me in the Internet of Things in the last 20 years, so I'm very optimistic about that.


Debbie Reynolds  24:21

Yeah, one thing I want to talk about that is going to be part of the report that comes out is the universal opt-out. I'm excited about this part of the report because there's so much data being collected. As you were describing, the Internet of Things went through this hype cycle, and people thought it was a big, hot new thing; obviously, every couple of months in the press, there's a new shiny object that catches people's attention. But while that's been happening, Internet of Things devices have become so ubiquitous that most people don't understand how many more Internet-connected devices they have in their homes and their offices. Actually, there's a statistic that companies have more Internet of Things devices than they do computers in their organizations. So these are all devices that, if not secured properly, could create a cyber risk as well as a privacy risk. I think that consumers, whether it's companies buying devices or individuals buying devices, don't really know what's happening in these devices. For me, an Internet of Things device is almost like a computer without a screen; it's doing something that you may not know about. Being able to give people more agency over their choices, and not make it difficult for them to make those choices, I think is very key. What are your thoughts?


Dan Caprio  26:00

I agree with you wholeheartedly. I really appreciate your leadership within the Advisory Board on this issue. You have been very tenacious, constructive, and willing to work in a process that's difficult at times. But you see the idea, the concept of the opt-out, in Federal law, and we haven't even talked about State law; it's a well-recognized concept. The report, when it comes out, coming out in favor of that, I think, is significant, because, unfortunately, we're not going to have a privacy law this Congress. There are lots of reasons for that; dysfunction in Congress is high among them, and it's an election year. But I think with that report and the idea of an IoT opt-out, that's a very strong recommendation and something that policymakers should really give a lot of thought and attention to. It's really because of your leadership on the Advisory Board that we've gotten there.


Debbie Reynolds  27:02

Well, thank you so much. I'm from Chicago. So I'm quite tenacious.


Dan Caprio  27:08

You and I figured this out. I lived in Chicago, and I'm a Cubs fan. We did win the World Series in 2016, thank goodness, but I'm an eternal optimist; you can't survive in DC for 40 years without being optimistic. As we say on the North Side, and I know you're a Sox fan, there's always next year. So there's always next year. We haven't said that yet because the season hasn't started for the Cubs, but there's always next year when it comes to privacy. I think what's important about our report, and a lot of other things: there's been a lot of really terrific academic work over the years, and the latest thinking by Dan Solove is about a turning point. So we need to be optimistic and continue to contribute, and hopefully move Congress in the right direction in terms of a comprehensive Federal privacy law.


Debbie Reynolds  28:00

Right. What is your view on the opportunity for privacy because of AI? Obviously, there are risks that are heightened as a result of these different types of data uses, and it's hard to get a hold of that; I understand that. But even in the most recent Executive Order from the Biden White House on AI, I think privacy was mentioned 30-plus times. I think people working in the spaces that you work in understand how the underpinnings of privacy can help and need to be addressed as we move forward into this new AI age. What are your thoughts?


Dan Caprio  28:46

You mentioned the Biden Executive Order. That's another turning point. I alluded to the US government and its response to Snowden and to the Schrems European Court of Justice cases invalidating Safe Harbor, and the creation of Privacy Shield as a whole-of-government approach. The Biden Executive Order is the same thing. The amount of cooperation within the government, all the different timelines and reports that are in the midst of being written now, and the level of international cooperation: in my almost 40 years of tech policy, I've never seen anything like it in terms of bilateral cooperation. There are meetings that go on regularly around cybersecurity, for instance, with the European Commission, and multilaterally at the OECD and the G7. The fact that all those multilateral fora have taken up AI, with the understanding that a legislative solution is going to be more complicated and take more time, is very encouraging. So where we go from here, in terms of privacy professionals: there's a huge opportunity, and there's also a huge risk. The risk is, and you see this more in Europe with the EU AI Act, we worked on GDPR for a solid decade before it passed, and now we're a couple of years into the implementation. GDPR talks a lot about a risk-based approach, but one of the things GDPR never really did was define what risk is. So a lot of what's described as risk-based is really more compliance. In other words, the implementation of GDPR goes so far down into the bowels of the company that it becomes compliance-based. The strategic element, how do you use data, what's the data risk at the senior executive level, at the board level: there are some companies that have addressed that, but most have just invested a lot of resources and a lot of time into compliance. So privacy has gotten the attention that it deserves, but in some ways it's the wrong attention. I think AI, on the other hand, because there are so many opportunities to use AI to enhance efficiency, staffing, the list goes on and on, means companies are beginning to understand that there's got to be a level of compliance in terms of algorithmic bias and transparency. For the sake of argument, let's stipulate that most companies want to do the right thing. With AI, nobody intends for their AI or machine learning or facial recognition to go wrong; look at the FTC case against Rite Aid. The FTC has reached a settlement with Rite Aid because their facial recognition software impacted certain communities in the wrong way. So they've said, if you're Rite Aid, you can't use this for five years. It's a wake-up call. But I think most companies have seen the wake-up call and are now beginning to understand they've got to put systems in place around AI. We're okay up to the point where there's human intervention. But the point where AI is beginning to interact without human intervention is coming a lot more quickly than people think. As a company CEO, senior leader, or board of directors, you've got to start to think about how you put policies and procedures in place to ensure that you have human intervention and that your AI systems are not just running on their own. This is present day; we're talking about something that is upon us now.
So the issue, I think, for privacy professionals and for all professionals is: how do we get to governance of AI that builds upon compliance around bias and transparency, all the things that we have talked about, but then is able to articulate, at the CEO level or the board level and through the organization, that here are the policies and procedures that are formalized? Because in the corporate world, we've seen this time and time again over the years: a lot of times when new technology comes up, we deal with it with intentionality, but it's almost more ad hoc. Form a committee, look at the shiny new object, and report back. That informal mechanism that's served us so well, and that has turned into formal governance for lots of other systems and networks, is really going to have to take shape very rapidly around AI. I think that's an opportunity for privacy professionals, because AI is so interdisciplinary and involves so many different parts of the organization, between legal and human resources and engineering and privacy professionals. We talked a little bit about a whole-of-government approach; what I think we need on the corporate side is a whole-of-company approach. AI is not something that can just be handled by one stovepipe; you've really got to look at it across the enterprise and figure out what the right governance structure is. I think we're seeing the beginnings of that now. But that's the exciting opportunity.


Debbie Reynolds  34:30

I agree with that wholeheartedly. So, if it were the world according to you, Dan, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, technology, or human behavior?


Dan Caprio  34:43

That's a big question. Well, I would start with a comprehensive Federal privacy law. We've needed that since the early 2000s, and without it we're just handicapped; I think AI is forcing that. But I think the most important part of this is really the governance challenge. We're going to have to roll up our sleeves, figure this out, and advise our clients on how systems and networks are put in place, and I'm speaking more in a cyber sense, but it's not really a cyber issue per se, so that we have the level of compliance that we need. You build upon that to a governance structure, and we work with outside experts; we've been leaning into NASDAQ and a lot of their work on board governance and AI. My wish is that, unlike what happened in cyber and is still happening in cyber, we accept that there is no checklist. If there were a checklist, we would have solved cyber a long, long, long time ago. My hope is that people will understand this is a dynamic process that involves governance; there are no shortcuts, and it's hard. If someone tells you there's a checklist to follow, well, we've been down that road with cyber, and in some ways we're still stuck in cyber because of that thinking. So with AI, my hope is that we will think more broadly, dynamically, and strategically, and measure the opportunity, but also the data risk, in a way that is meaningful and translates into a policy that can permeate down from the senior level of a company all the way down into compliance, because that's what we're going to need for AI.


Debbie Reynolds  36:37

I agree. I agree with that. So we have to dispense with the tactics and move forward with a type of strategy that we can bundle into policy and habit. I hate to use the phrase trickle down, but it has to permeate through all parts of the organization.


Dan Caprio  36:53

One hundred percent.


Debbie Reynolds  36:56

Well, thank you so much, Dan; I'm so happy that we were able to do this session. I'm so proud of the work that we've been able to do together, and that the board has been able to do, on the Internet of Things Advisory Board, and you're just so much fun to talk to, because I love the shorthand and we understand what's happening. This is an exciting time, and it is a turning point, I think, in terms of how people are thinking about privacy, especially as it relates to AI. So yeah, we'll talk more for sure.


Dan Caprio  37:26

Thank you for the opportunity, and I look forward to working with you in the future.


Debbie Reynolds  37:30

Yeah, thank you so much and I'll talk to you soon, next week. All right.