"The Data Diva" Talks Privacy Podcast

The Data Diva E272 - Sean Pauzauskie and Debbie Reynolds

Season 6 Episode 272


In Episode 272 of The Data Diva Talks Privacy Podcast, Debbie Reynolds, The Data Diva, talks with Sean Pauzauskie, Medical Director at the Neurorights Foundation, about the emergence of neurorights and why brain data represents one of the most sensitive frontiers in privacy and human rights. Sean explains what neurorights are, how they developed from advances in neurotechnology, and why mental privacy, identity, and free will must be protected as technology becomes capable of reading and influencing brain activity.

Debbie and Sean explore the five core neurorights, including mental privacy, fair access to mental augmentation, personal identity, free will, and freedom from algorithmic bias. They discuss real-world neurotechnology use cases, from medical treatment to consumer wellness devices, and why commercialization increases the urgency of governance. The episode examines risks such as discrimination, surveillance, and misuse of neural data, even in the absence of malicious intent.

The conversation also highlights Colorado’s groundbreaking neural data protections and how state-level action can address human rights gaps left by federal consumer-focused laws. Debbie and Sean discuss why states can serve as laboratories for rights-based protections, how neurorights differ from traditional data privacy, and what policymakers, companies, and individuals should be thinking about as neurotechnology becomes mainstream.


Become an insider: join Data Diva Confidential for data strategy and data privacy insights delivered to your inbox.


💡 Receive expert briefings, practical guidance, and exclusive resources designed for leaders shaping the future of data and AI.


👉 Join here:
http://bit.ly/3Jb8S5p

Debbie Reynolds Consulting, LLC



[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is The Data Diva Talks Privacy Podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.

[00:25] Now I have a very special guest on the show,

[00:28] Sean Pauzauskie. He is the Medical Director of the Neurorights Foundation. Welcome.

[00:35] Sean Pauzauskie: Yeah, thank you so much for having me.

[00:37] Debbie Reynolds: Well, it was exciting to have you here. You and I had a chat, a call, and I was really fascinated by your background,

[00:45] especially because neurorights are such an emerging area of technology and law.

[00:52] And I thought you'd be the perfect person to be on the show to talk about that. But tell me, first of all, a lot of people don't even know what neurorights are,

[01:01] but tell me about your trajectory or work in neurorights, and explain to the audience: what exactly are neurorights?

[01:09] Sean Pauzauskie: Yeah, absolutely, absolutely.

[01:11] So neurorights are principles that, you know, really sprung into existence about 10 years ago.

[01:18] You know, at the time there was a funding initiative through the NIH called the BRAIN Initiative, which stands for Brain Research through Advancing Innovative Neurotechnologies.

[01:28] It's been carried forward now for 10 years, and I think they're up to, you know, $5 billion in funding over that 10-year period,

[01:36] which really jump started this whole area of neurotechnology.

[01:39] It's just kind of across the whole spectrum. And so, just broadly, there are two types of neurotechnology.

[01:45] There is neurotechnology that can read brain activity; brainwave readers are mostly what's out there in the consumer space right now.

[01:53] And then there's neurotechnology that can influence brain activity through different means, you know, electrical, magnetic, ultrasound, things like that. So those are the two broad categories.

[02:05] And then obviously you have, like, your implantable Neuralinks, and there are a lot of medical applications of implantable neurotechnologies that are developing in this kind of greater ecosystem that was started by the BRAIN Initiative.

[02:17] So as you know, and as your audience knows, technology is pretty much inherently neutral. Think about nuclear technology:

[02:26] it can power a city, it can cause damage. Genetic technology,

[02:30] it can help people diagnose and treat medical conditions. But you can also do some pretty weird things with it, like create cloned people and things like that.

[02:40] So neurotechnology is no different. And so that's why, in 2017,

[02:44] the scientific community got together at a conference at Columbia University and generated what are now called the five core neurorights.

[02:55] And so those address the ethical issues that surround the development of neurotechnology. Basically, there are five neurorights that came out of that meeting.

[03:06] It was called the Morningside Group, led by Dr. Rafael Yuste, who was also the lead scientist behind the BRAIN Initiative. The first neuroright is mental privacy.

[03:13] Mental privacy means that everyone should have access to and control over their brain information.

[03:19] That information should be kept within the bounds of whatever that person would like to happen.

[03:25] And it should not be used for any kind of nefarious purposes. For example, for targeted advertising or for insurance companies to, you know, discriminate against people based on their brainwaves.

[03:36] It's okay if you want to give it away, but it should be your right. So mental privacy is the first neuroright.

[03:42] The second neuroright is called fair access to mental augmentation. So, in that second category of neurotechnology, the type that can influence brain activity, it can do a lot of great things.

[03:53] I'll just give a couple of examples. Just recently, it was shown that, using ultrasound, people's memories could be boosted, which you could imagine could be a big deal for people who have Alzheimer's or other types of memory loss, things of that nature.

[04:07] And ultrasound was also, just this week actually, shown to be able to influence learned behavior by stimulating the reward center in the brain, which could be very useful in terms of helping people who have conditions like obsessive-compulsive disorder, certain addictions, even anxiety and depression. All of these can be addressed by influencing brain activity.

[04:27] We believe at the Neurorights Foundation that

[04:29] everyone should have fair access to the benefits of neurotechnology. There should not be a disparity created in society where only some people can access or afford or, for whatever reason,

[04:41] have neurotechnology. And so fair access to mental augmentation is the second neuroright.

[04:46] Now, the next two are kind of tied together somewhat,

[04:49] and those are what we call personal identity and free will.

[04:55] So as you could imagine, if neurotechnology can help you get over your addiction or treat your memory loss, it could potentially be used to stimulate you in ways that might violate your sense of your identity.

[05:05] And we believe that no one should be compelled to undergo, you know, treatments or any uses of neurotechnology that, because this is powerful stuff, have the ability to fundamentally alter your sense of self.

[05:18] And we don't think that should happen. And then, kind of concurrently, free will is the fourth neuroright. Part of your identity is your will, and no one should have their mind controlled in one way or another.

[05:31] And we could get into that as far as neural data and how it's possible to influence people's free will or manipulate that will in certain ways. But we believe that's the fourth core neuroright.

[05:41] And the fifth, it kind of ties back into the first on mental privacy and neural data, in that people should be free from bias. You should never have your brainwave activity looked at as being fundamentally good or bad just based on an algorithm or anything like that.

[05:56] So that's kind of a broad historical perspective, all the way from the BRAIN Initiative to today. There are about 30 or more consumer companies selling mind-reading neurotechnologies that can measure brain activity today, making it very much a today problem.

[06:12] But that's kind of the broad perspective and the five core neurorights.

[06:16] Debbie Reynolds: Excellent, excellent. Thank you for sharing that.

[06:20] I think one of the things that concerns me about neurorights or neural data,

[06:27] and maybe this is the same,

[06:29] I can probably come up with an example of many different emerging technologies, but because this is so personal to the individual, there's always a fear about someone not using it in the way that it was intended.

[06:44] Right. So like you had given an example where someone, if they had some type of like behavioral challenge,

[06:52] that they could possibly use this to correct that or change it or help them in some way, which I think is always great.

[07:00] Right. But I think the challenge that we always have, especially when some of these technologies become commercialized and they go off the path of like maybe a medical use of some sort, they may be manipulated and used in some different way.

[07:15] But I want your thoughts.

[07:17] Sean Pauzauskie: Yeah, absolutely, absolutely. Well, those are all great points. And really I think that the neuro rights movement is kind of born out of a little bit of anxiety when it comes to, you know,

[07:29] and it's not that we have a fundamentally good or bad view. In fact, we have both in that, you know, we should be promoting the beneficial uses that you mentioned at the same time that we should be preventing the nefarious uses, potentially nefarious uses.

[07:43] So as of right now, I don't think that there have been any big, you know, data breaches or things that could be considered nefarious uses of neurotechnology.

[07:53] The Neurorights Foundation did generate a report last year,

[07:56] actually released the same day that the first law in the world to protect neural data was passed here in Colorado, looking at the privacy practices of the 30 companies that were examined, benchmarked against global standards of human rights and

[08:10] privacy protections. And what they found, as you would imagine, is that a lot of these companies,

[08:14] you know, they're startups and they don't have, you know, huge legal budgets to generate privacy agreements. And so there was a lot of kind of just copying and pasting from, you know, various legal perspectives.

[08:26] And a lot of them, unfortunately, fell short of, you know, guaranteeing the rights of the consumers who were purchasing the products. Now, that's not to say that that was intentional.

[08:36] It's just that this is kind of a fledgling area of industry,

[08:40] and as the industry grows, we just want to be there to help guide the safe and ethical evolution and development of neurotechnology so that it's really helping people rather than posing any potential risks, because this stuff is so powerful and it can help so many people.

[08:54] I'll just give you one example. Right now I'm wearing a pair of headphones from a company called Neurable, a company that I really like and I think are really great.

[09:03] And it's collecting my brainwaves right now and measuring how focused I am. Now, we can do that because, broadly, we categorize brainwaves into about five different frequency bands.

[09:16] So right now it's measuring my brain activity and telling me what percent of my brain activity is in those focused or faster frequencies.
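
To make the frequency-band idea concrete, here is a minimal sketch, in Python, of how a focus metric like the one Sean describes could be computed from raw EEG. This is an illustration only, not Neurable's actual algorithm: the sample rate, the band edges, and the choice of beta as the "focused" band are all assumptions.

```python
# Hypothetical sketch of a "percent focused" metric from raw EEG.
# NOT Neurable's actual algorithm; sample rate and band edges are assumed.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sample rate (Hz), typical for consumer EEG headsets

# One common five-band categorization of brainwave frequencies (Hz)
BANDS = {
    "delta": (0.5, 4.0),
    "theta": (4.0, 8.0),
    "alpha": (8.0, 13.0),
    "beta": (13.0, 30.0),   # the faster, "focused" range in this sketch
    "gamma": (30.0, 45.0),
}

def band_powers(eeg: np.ndarray, fs: int = FS) -> dict[str, float]:
    """Average spectral power in each band, via Welch's PSD estimate."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def focus_percent(eeg: np.ndarray, fs: int = FS) -> float:
    """Beta's share of the summed per-band average power, as a percent."""
    p = band_powers(eeg, fs)
    return 100.0 * p["beta"] / sum(p.values())

# Demo on synthetic data: 10 s of noise with an injected 20 Hz rhythm.
t = np.arange(0, 10, 1 / FS)
eeg = np.sin(2 * np.pi * 20 * t) + 0.5 * np.random.randn(t.size)
print(f"{focus_percent(eeg):.1f}% of power in the beta band")
```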

[09:25] Now, I think that's great because I use these headphones when I wake up every morning and write my clinic notes and just throughout the day, and it lets me know when I'm losing focus.

[09:33] You know, you don't always feel that. You don't always feel when your brain activity is getting to that point where, hey, maybe it's time to take a break. You know, just like you can't do pushups all day or run all day, you need to take breaks.

[09:46] And so this is smart tech to help you sort of manage the mental load on your brain, which, as you know and as your audience knows, is just becoming greater and greater as digital technology and information technology penetrate our daily lives.

[09:59] And so just kind of coming back to your point about,

[10:02] you know, the good versus what could be done with this: right now, I think that there is a lot more good being done, but we just want to be sure that only the good comes out of this really extraordinary movement.

[10:13] Debbie Reynolds: Very good.

[10:14] I want to talk a little bit about Colorado and the neurorights law you talked about there. And actually, neurorights are something that I've covered over the years.

[10:23] I'm sure, you know, that Chile was the first country to actually put neurorights in its constitution, which I was very impressed with.

[10:30] But tell me about what's happening in Colorado that people need to know about.

[10:35] Sean Pauzauskie: Yeah, yeah, so. So you're absolutely right. So Chile, I believe in 2021, became the first jurisdiction to basically say, hey, like, we need to do something about this. Like, we need to protect the brain.

[10:47] And so they amended their constitution, you know, a national constitution, which is a big deal. And so that really kind of planted the flag, you know, in South America. And then things have kind of spread from there.

[10:57] The state of Rio Grande do Sul in the south of Brazil became the second jurisdiction to do that in its constitution.

[11:05] And that was in 2023,

[11:07] but kind of fast forward a little bit. So in the fall, I believe, of 2021,

[11:14] I reached out to the Neurorights Foundation because I was doing clinical research here in Colorado using a device. Again, I love these devices. I use them in my research, and I think that they're great.

[11:24] But I was having some of that kind of anxiety about it. You know, it was a project to monitor patients with epilepsy who are at home,

[11:32] which is great and phenomenal and had never been possible before.

[11:36] But I came across the Neurorights Foundation,

[11:38] reached out to them, saw the work that they were doing, and said, hey, I'm just curious, you know, I like neurotechnology. I use it in my research.

[11:44] And would you be interested in doing some work in Colorado?

[11:47] And so it took about six months to, you know, get in touch with them, and another almost a year to really come up with a strategy that we were going to pursue.

[11:57] But what we ended up doing is that I approached a local legislator who is now a senator. She was a House member at the time, and her name is Kathy Kipp.

[12:06] And Kathy is great, just kind of one of those powerhouse, force-of-nature types of people, very well respected in the legislature. I think she'd been in there for about six years.

[12:15] And I approached her at a local event that was kind of open to the public, and I just kind of went up cold and said, hey, you know, I'm one of your local constituents.

[12:23] I'm a physician, I'm a neurologist, and I'm concerned about this issue of people owning their brain data. And the first words out of her mouth, I kid you not, were:

[12:32] Who would be against that? You know,

[12:34] so at that moment, I kind of knew that we had something, like some potential. Yeah, one of the jokes that kind of gets thrown around about this is that it's a no-brainer.

[12:45] One of our legislators actually said that at the bill signing, just because it is intuitive.

[12:52] It's your brain, like, who would be against people owning their brain data. So that was kind of the seed of the spirit and the effort. And then it took about another year for us to put a working group together.

[13:02] We got together with the Attorney General's office, who were great and integrally involved from the beginning and supportive, and, you know, it's always great to have your Attorney General's office on board.

[13:12] And then we put together a bill. The strategizing took just a little while. We thought at first we'd do a standalone bill, but then what we ultimately arrived at was that there are about 20,

[13:22] actually now about 25, states that have a broad privacy act which protects all kinds of data that you would consider sensitive and beyond:

[13:30] you know, your bank information, your address, your phone number, your Social Security number, all these things that you would not want getting into just anyone's hands.

[13:41] And so, lo and behold, there was kind of already a framework for us to pursue in terms of amending that law, which Colorado had,

[13:51] to include a definition of neural data. Now, we could get into the weeds a little bit about the definition of neural data, but suffice to say that we consulted experts broadly, really brought everyone to the table, and came up with the best definition of neural data that we could,

[14:07] given the lay of the land and our expertise.

[14:10] And lo and behold,

[14:12] it was a runaway success in the legislature. It passed unanimously or near-unanimously in both houses. There was one person voting no on everything for personal reasons in the House,

[14:23] but it ended up passing the Senate unanimously, went to the governor's desk in April of 2024, and became really the first law with hard-law protections for neural data,

[14:34] meaning that, you know, now there's kind of a legal framework and some backend enforcement provisions for neural data. And so that's what we did in Colorado.

[14:43] Debbie Reynolds: That is amazing. Oh, my gosh.

[14:46] Thank you so much for your service. I tell people privacy is an inch-by-inch battle, and you did it.

[14:54] You definitely did it.

[14:56] Yeah.

[14:56] Sean Pauzauskie: I feel like everything just kind of came together with the right people in the right place at the right time, and I felt very fortunate,

[15:02] but I think it was for a good cause.

[15:04] Debbie Reynolds: Yeah.

[15:05] So I want to talk a little bit about federal

[15:09] versus state. Right. So as you, I'm sure, know, everyone keeps saying we need federal privacy legislation,

[15:19] which I don't disagree with.

[15:22] But what we do see is that, because there has not been that type of groundswell or movement to actually do that on a federal level, the states are really taking up that challenge.

[15:34] And I think one of the different things about the states,

[15:39] I guess people don't really understand how the states and the federal government each have rights, and they're not interchangeable. Right. So they're in some ways incongruent, but they're made that way so the states can have their own rights.

[15:53] But one of the tensions that we have on a federal level around privacy a lot of times is based on consumer rights as opposed to human rights. And so I feel like states can do more things that are more targeted to humans as opposed to just consumers.

[16:16] So I saw this in Illinois with the Biometric Information Privacy Act.

[16:22] And it's so funny, because I have a lot of people who have told me over the years, even legal scholars, oh, well, if we have a federal law, then it's going to preempt this law.

[16:34] But it can't preempt it, because it's actually broader than the consumer protection that you could have at a federal level. And so that's why I think it's really fascinating that you've done this in Colorado, and I'm happy to see that you were able to get this done.

[16:49] Sean Pauzauskie: Yeah. Well, thank you so much for that. And I think your analysis is fascinating.

[16:53] And I think you're absolutely right that the states are kind of like laboratories of legislation. And I believe in federalism, and this has kind of become that kind of movement. Just to frame it a little bit further:

[17:07] so this is actually a strategy that's taken off in different states. In Illinois, there's a legislator named Anne Stava-Murray from Downers Grove, and she has introduced a bill to amend the Biometric Information Privacy Act to include a definition of neural data.

[17:23] That's a bill that's pending in that state.

[17:25] There are three additional states that have passed laws modeled off of Colorado's, including California, whose privacy act, you know, is kind of the biggest and most important at the state level in the US currently, and probably second in the world only to the GDPR.

[17:41] And so they became the second state in September of 2024. And then Montana, you know, caught wind that things were going on. And if California and Montana can agree on anything,

[17:52] it's kind of a miracle.

[17:53] But they did, and neurorights passed in Montana unanimously in both houses. And then the state of Connecticut became the first state on the East Coast to amend its privacy act to include a definition of neural data.

[18:05] And so, in addition to Illinois, we have bills pending or in process in, I think, about six or seven, maybe up to eight or nine states by now. You know, they're kind of sprouting up gradually.

[18:16] But Alabama is one, the state of Washington. We're in conversations with New York, Delaware,

[18:22] you know, you name it. Neurorights are kind of taking off under that federalist, you know, philosophy. But to your point, we absolutely do need federal standards for this.

[18:31] And that's actually part of the strategy that we're pursuing: to really build that collective momentum and say, hey,

[18:38] we can do this in a patchwork way, but a one-size-fits-all approach is really going to be the thing that pushes this over the finish line in terms of protection for consumers, and hopefully those human rights, you know, can get written in as well.

[18:52] And to that end,

[18:54] I wanted to say that three Democratic senators, just a couple of months ago, introduced a bill called the MIND Act, which stands for the Management of Neural Data Act or something to that effect, I'm blanking on the acronym just at the moment,

[19:11] but a very important piece of legislation sponsored by Chuck Schumer, Ed Markey and Senator Cantwell from the state of Washington,

[19:19] and the spirit of that bill is to actually get the FTC to look into this and create standards for industry for the management of neural data, and then also this other related data that isn't quite neural data but is related to activity of the brain.

[19:34] But you're absolutely right that the federal-state interplay is that kind of dance that we do, but it can all work together. And I think that's the beauty of our system: we can have things happening at the state level and things at the federal level,

[19:48] but obviously once you have the federal traction, the federal momentum,

[19:52] you're way far ahead of having to go to all 50 states.

[19:54] Debbie Reynolds: That's so true. That's very true.

[19:58] Sean Pauzauskie: I just remembered: it's the Management of Individuals' Neural Data Act. So I apologize.

[20:02] Debbie Reynolds: Okay, good.

[20:03] They're very creative with these acronyms. That was a good one. Yeah.

[20:08] I want your thoughts about,

[20:10] we had talked about neural data, like with the headphones that you use, or something helping someone with a behavioral challenge.

[20:20] But I want to talk about neural data on the consumer side. Where will people come in contact with technology where maybe their brain data is being collected and they may not think about it, maybe with a VR, virtual reality, headset or something?

[20:39] Sean Pauzauskie: Yeah, absolutely.

[20:41] So, yeah. So neural data,

[20:43] we talked about brainwaves earlier.

[20:45] And so the brain is essentially an electricity-generating organ that generates oscillating waves that we call brainwaves, or EEG. I'm sure some of your listeners are familiar with EEG as electroencephalography.

[21:00] And so, about those waves, I should mention concurrently that without AI, we are not really able to decode too much from that kind of neural data. Like, I use it in the hospital.

[21:12] I can tell you if a patient is having a seizure or might have a seizure, I can tell you if their brain's functioning properly and maybe a few reasons,

[21:19] you know, behind that. But with AI, I mean, we're getting to the point where you can actually decode thoughts. There was a study last year in Australia that showed about 40% thought-to-text fidelity just using brainwaves, which, if you think about it, is kind of phenomenal.

[21:32] You can just think a word and the AI and the brainwaves together generate that word. I mean, it's just incredible what's happening, but kind of to your point about the consumer level.

[21:43] So as I mentioned earlier,

[21:45] there are today about 30 companies, broadly in the wellness space, selling devices to collect, decode, and give you insights into your brain. Kind of like a Fitbit for the brain would be one way of thinking of it.

[21:59] But a lot of these are headbands. I mentioned the headphones, which were kind of a big development last year. But most of what they're marketed for is to improve your mood.

[22:08] They can help you meditate, they can help you sleep, and they're very useful in terms of things like,

[22:15] kind of like the hobbyist community. If you want to do experiments on your brain, if you want to learn how to control your computer with your brain, you can actually do that today.

[22:23] So gaming is another big area of this, where, to your point about AR/VR, it could actually be reading your brainwaves, knowing in real time what's going on in there as you're playing, and giving you insights,

[22:35] kind of going back to the whole idea of how we can tune our brains into different states.

[22:41] But I think the general principle here and the reason that we care about data at all in general is that if you can't track something, you can't change it.

[22:49] So giving people insights into their brains for all these different purposes is really how people are going to encounter them in the consumer space.

[22:56] Apple, for example, has a patent on an EEG AirPod.

[23:00] If anybody out there has the noise-canceling AirPods, those are going to have EEG sensors, we believe, in the next, say,

[23:06] two to five years. It's hard to tell. Apple did just create a template, or a platform, for medical uses with a company called Synchron to use implantable neurotechnology.

[23:18] So they're already creating the groundwork and the framework to use neural data. And as you know, 100 million or more AirPods are sold every year. So this stuff is just getting ready to explode exponentially.

[23:32] Another company, Meta, just released AR/VR glasses that can be controlled with a wristband.

[23:38] Believe it or not, you can give commands. And I was actually down at Best Buy in Denver and picked up a pair; I'm excited for my glasses that have the wristband where I can give commands and things like that.

[23:49] So this stuff is just going to become a part of daily life soon. But it's very important that we get out ahead of it, to your earlier point, and make sure that only the good comes of it.

[23:58] Debbie Reynolds: Yeah, I'm a data person, so the privacy challenges that I see with data in general are about what data is collected,

[24:07] how long it's retained, and how it's shared. So those are the three big buckets where you get into these privacy issues.

[24:15] And a lot of times,

[24:17] as I'm sure you know,

[24:19] a lot of technology systems are made to remember data and not to forget it.

[24:23] That a lot of times has friction with privacy, where privacy says delete it, get rid of it, or de-risk it, not after a certain

[24:34] period of time, but based on the purpose it serves for the user. And so a lot of times, what we see in situations where people's data is being used in a way that they don't like is that the data use somehow veers off away from the benefit to the person.

[24:55] Right. And people feel violated by that.

[24:58] But I just want your thoughts on that schema.
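
The purpose-based retention Debbie sketches here can be made concrete in a few lines. The schema below is a hypothetical illustration, not any company's actual implementation: a record is retained only while the purpose the user consented to is still active, rather than for a fixed clock period.

```python
# Hypothetical purpose-bound retention check: keep data only while the
# consented purpose is still being served, not for a fixed time window.
from dataclasses import dataclass

@dataclass
class NeuralRecord:
    user_id: str
    purpose: str       # e.g. "focus_feedback" (illustrative purpose name)
    consented: bool

# Purposes each user is still actively using (assumed bookkeeping)
ACTIVE_PURPOSES = {"alice": {"focus_feedback"}}

def should_retain(rec: NeuralRecord) -> bool:
    """Retain only records whose consented purpose is still active."""
    return rec.consented and rec.purpose in ACTIVE_PURPOSES.get(rec.user_id, set())

records = [
    NeuralRecord("alice", "focus_feedback", True),   # kept: purpose active
    NeuralRecord("alice", "ad_targeting", True),     # dropped: purpose inactive
    NeuralRecord("alice", "focus_feedback", False),  # dropped: no consent
]
kept = [r for r in records if should_retain(r)]
print([r.purpose for r in kept])  # ['focus_feedback']
```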

[25:02] Sean Pauzauskie: Yeah, absolutely, absolutely. So I think this is where we sort of get into the area of informed consent. The Neurorights Foundation, we are actually working right now with industry.

[25:13] We're going to be out at a conference in Asilomar with a group called Brain Mind.

[25:18] And so we are very actively engaged with industry on

[25:22] all those points that you're bringing up about data retention: what's the purpose of the data, and, kind of most importantly,

[25:30] what does the consumer know, or how is this communicated to the consumer, about what the purpose of this data is?

[25:37] Where I kind of get the most,

[25:40] you know, excited about just trying to make things better is in terms of what can be decoded from the data, you know, letting people know: here's kind of like the 23andMe of your brain.

[25:52] It can tell you a lot about what's going on in your brain. And we want to give people that kind of framework, because we're all ignorant on different subjects. I don't know anything about a lot of stuff, and I want to know, when I walk into a consumer-facing agreement, what's this company planning to do with this, or what could they do with this?

[26:09] And so those are all issues that we are integrally working with industry on: trying to develop standards, trying to develop privacy policies. We're actually working with a few companies toward an industry-standard privacy policy that we hope becomes the standard across the industry, one that does embed all of these human rights protections for privacy,

[26:30] so that data is only used for the purposes for which it's intended.

[26:35] Debbie Reynolds: Yeah, I've done some work and advisory in the VR/XR space, and for me, when I was thinking through some of the privacy challenges,

[26:49] I feel like in the future, consent will need to be incremental, and maybe in phases, as opposed to seeing a block of text of 80 pages and just clicking a button, and you don't know what you said yes to.

[27:06] As you're going through an experience or a journey, I feel like consent will need to be more, maybe fragmented, fragmented sounds like a bad word, but it needs to be broken up in some way, and people need to have choices and options at different phases based on how they're using the technology.

[27:24] What do you think?
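
One way to picture the incremental consent Debbie describes is as a series of narrow prompts, each raised at the moment a new data use becomes relevant, instead of one 80-page agreement up front. The sketch below is a toy illustration; the scope names and simulated answers are hypothetical.

```python
# Toy sketch of incremental consent: each narrow data use triggers its
# own prompt at the point of use. Scope names here are hypothetical.
user_choices = {                      # simulated answers; a real app
    "collect_focus_metric": True,     # would prompt the user in the UI
    "share_for_research": False,
}

granted: set[str] = set()

def request_consent(scope: str, explanation: str) -> bool:
    """Ask for one narrow scope at the moment it becomes relevant."""
    print(f"PROMPT: {explanation} Allow '{scope}'?")
    if user_choices.get(scope, False):
        granted.add(scope)
    return scope in granted

def use_data(scope: str, explanation: str) -> None:
    # The experience continues either way; declining one scope only
    # disables the feature that needs it.
    if scope in granted or request_consent(scope, explanation):
        print(f"-> proceeding with '{scope}'")
    else:
        print(f"-> '{scope}' declined; continuing without it")

use_data("collect_focus_metric", "We estimate focus from your brainwaves.")
use_data("share_for_research", "Share de-identified brain data with researchers?")
```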

[27:26] Sean Pauzauskie: That's a great point, and I actually really like that.

[27:29] I totally agree. I mean, I think we're all guilty of just kind of clicking okay: I want to use the thing, here's 80 pages, and I don't have three hours to read it.

[27:38] So there's just kind of that implicit trust that, okay, nothing bad's going to happen here. But that's actually what the Neurorights Foundation report that we did

[27:46] was: taking all those privacy practices and going through them with a fine-tooth comb. A very brilliant academic named Stephen Damianos, who's actually the executive director now, took the time to go through and read all of those privacy policies.

[27:59] But that's a fascinating idea that you bring up, and I would love to hear more about what the different increments should be. Should there be a basic privacy agreement that covers the broad categories and then gets more granular?

[28:14] Should it start granular and then get more broad? But I love that idea generally, and thank you so much for bringing it up, because I'd not thought of that before this point.

[28:24] Debbie Reynolds: I guess I imagine it almost like gamification, in a way.

[28:28] Just think about if you're in a video game, you have to make choices about where you go, right. And based on the choices that you make,

[28:36] maybe you'll get a prompt or something or some type of message where you can make a choice.

[28:41] Because really I think that's what people are concerned about, where they feel like they get into experiences or they say yes to the 80 pages and maybe there's something that they don't like.

[28:51] They don't know how to say no or they don't know how to back out of things. So I'm happy to talk to you about it. To me, it's fascinating.

[28:59] Sean Pauzauskie: Yeah, yeah. No, I think that's kind of a brilliant strategy, because taking it in pieces is always a little bit easier, and maybe easier for people to understand, than this big, holistic, broad 80-page agreement where we all just kind of click yes and hope for the best.

[29:18] Debbie Reynolds: Yeah, well, I think the technology is there to help us do that.

[29:22] So I hope to see more development in that area.

[29:26] One thing I want to talk to you about is bias.

[29:29] You mentioned that one of the tenets of the Neurorights Foundation is about people being free from bias. And so bias concerns me greatly, because obviously we are not the same; people are different.

[29:45] But I am always concerned about the types of inferences that get made about people based on the data about them. Right. And unfortunately, sometimes those inferences are not apparent.

[30:00] They're not transparent to people, and then decisions may be made about them that they may not know about or understand,

[30:08] right, based on inferences. But I want your thoughts about bias in that realm.

[30:15] Sean Pauzauskie: Yeah. So, I mean, all of the neurorights are rooted in justice, but I think that the bias neuroright is probably the one that is the most deeply rooted in it.

[30:28] I tend to think of, like, the Midnight in the Garden of Good and Evil cover:

[30:33] the kind of, I'm not a lawyer, but sort of blind, equal weighting of everything, without any kind of, you know, implicit preference for one thing or another.

[30:44] And so, however those principles can be written into the algorithms that are interpreting the data, controlling the uses of the data, and categorizing things,

[30:53] I think we should stay agnostic to everything, just take things with a blank slate and say a brainwave is a brainwave. Everyone's brainwaves are the same.

[31:03] Fundamentally, they're all electrical activity. They're all just oscillations of the same tissue. And so I think that with those principles, in the future and now, I mean, really right now, we should be working on the algorithms that are interpreting the data.

[31:19] I think this is an interpretation issue, for whoever's out there building these algorithms:

[31:24] just be sure that everything remains blind, equally weighted, almost de-identified to the nth degree, to ensure that no one's personal preferences or biases are embedded anywhere in the data.

[31:40] Debbie Reynolds: Yeah, that concerns me greatly. So although this is not a neural example, I've read about certain financial institutions that treat customers differently.

[31:53] Like, you may get more favorable rates for loans or something because you have an iPhone versus an Android. Okay,

[32:00] so to me that's bonkers. That should not even be allowed anywhere. So when you have data being collected about people, maybe the data that's being collected infers something that may not be true

[32:18] about a person, and then you don't want anyone to make a decision that hurts that person.

[32:25] I'll give you another example. So this happened with Cambridge Analytica. One of the scientists who was on the Cambridge Analytica team

[32:33] said that, based on all the psychographic data they had picked up from these surveys that they had people do, they had a thing called the Kit Kat project.

[32:45] And so the Kit Kat project,

[32:47] what they found, and it was a correlation,

[32:50] right, was that people who liked Kit Kats on Facebook or whatever tended to like antisemitic messages. Okay.

[33:01] So the problem with that is: if you like Kit Kats, are you an antisemite?

[33:08] Right. So that's the problem that we have with inference, and that's the problem we have with data: it may skew in certain ways, and people may make inferences from that.

[33:19] And that's always my concern there.

[33:22] Sean Pauzauskie: Yes. Yeah. We should always be challenging any assumption.

[33:26] How do we ensure with neural data that there aren't any implicit assumptions prior to the analysis, or prior to the use, or prior to the application? Like, this particular pattern could be tied to something.

[33:38] But then I think we get into correlation never proves causation. I mean, that's the biggest no-no in all of science.

[33:48] Yeah. I mean, it's one of those unfortunate parts of human nature. But if you're really serious about your technology or what you're doing with the data, then you are always trying to eliminate or down-regulate any conflation of correlation with causation.

[34:05] So yeah, I'm in complete agreement. That needs to be a big part of this.

[34:10] Debbie Reynolds: I agree completely.

[34:12] Well, Sean, if it were the world according to you and we did everything you said, what would be your wish for privacy and neural data anywhere in the world,

[34:20] whether that be human behavior,

[34:23] technology or regulation?

[34:28] Sean Pauzauskie: Yeah,

[34:28] so this has kind of taken me back to my college days, and I think life is kind of funny in that it's difficult to draw the line from what you were thinking about back then.

[34:38] For me, I won't say exactly when, I don't want to date myself too much, but I wrote an essay when I was in college about American privacy. And I guess I have to date myself, because it was in the context of the aftermath of 9/11 and some of the stuff that came out then, the different legislation and the different ways that we were trying to,

[34:57] you know, fight and win the war on terror. I mean, those are all great goals, but, well,

[35:03] what is the quote by Benjamin Franklin, about sacrificing personal liberties and not deserving either if you're willing to sacrifice liberty for just safety? And so, at the end of the day, what I remember writing in that essay, and what's come back to me in a lot of this neurorights work, is that we should have absolute freedoms of privacy, and when and wherever possible, those should be guaranteed fundamental rights.

[35:26] And I would say that for Colorado, I would say that for the U.S., I would say that for the world. The right to be left alone.

[35:32] I think that came from a Supreme Court justice. It may have been Warren. But again, I'm not a legal scholar, so I won't misattribute a quote. But the right to be left

[35:41] alone.

[35:43] Yeah. So I think that we start there and guarantee everybody the right to be left alone if they want to be left alone.

[35:49] But in that essay as well,

[35:51] I also talked about the right to have a freedom from privacy.

[35:56] So I think with neurorights, just like with genetic data, just like with a lot of things like it: as long as it's your choice to be sharing information,

[36:05] then that ought to be your choice too. You shouldn't have to stay concealed within this kind of personal sphere. And so I think we ought to have freedoms from privacy at the same time.

[36:14] But I believe we start with the freedom of privacy and the right to be left alone, and then give people the right to share, gently or whenever they would like, just like with genetic data.

[36:24] I'll just be transparent. I shared my genetic data with 23andMe. Does it make me nervous? Maybe a little bit in terms of, like, what they could figure out about me with my health.

[36:33] But I feel like it could possibly drive medical research forward.

[36:37] And at the end of the day, I'm okay with that as long as it's not coming back to me in some way, like I'm being discriminated against by insurance companies or things of that nature.

[36:45] And I think the same should apply to neural data.

[36:48] Debbie Reynolds: I think so.

[36:50] Well, you're in a fascinating area of work, and thank you again for all your hard work in this area. I think

[36:58] a lot of people will probably follow your blueprint, what you're doing, to try to get some of these types of laws passed and on the books in different states.

[37:07] And yeah, I'm just interested to see how things play out. Obviously, it'll become a much bigger issue, especially as more consumer products come out where there's more brain data being collected.

[37:20] But yeah, we'll just see how things turn out. It'll be interesting.

[37:24] Sean Pauzauskie: Yeah. Yeah. I feel like I'm never one to count chickens, but I feel like we have a chance. That's all we can ask for: that we have a chance to get out ahead of this and be sure that only the good comes out of the neurotechnology revolution.

[37:40] I'm just so thankful to be having these types of conversations.

[37:43] I am trying to make this stuff a little bit more accessible in that I've written a novel, or I'm writing a neurotechnology trilogy, and I just kind of want to plug that for listeners.

[37:51] It's called the Thomas Mariner neurotechnology trilogy, and the first installment just came out in October. It's called Stage of Fools.

[37:58] And it basically looks at this notion,

[38:01] as you could imagine, it's very neurotechnology-centric,

[38:04] of that fair access to augmentation.

[38:06] The character is in a coma. We've learned in the past 10 years that people are a lot more sentient in a comatose state than we thought. And when you talk about neural data, maybe someday in the future we'll be able to decode people's thoughts and talk to people in comas, and all these amazing things.

[38:21] So the novel does play on that. But I feel like as long as we continue these types of conversations and stay, you know, accessible to each other and stay civil,

[38:32] and be sure that all the voices are heard and everyone's at the table, I think we have a chance to get this right and be sure that the neurotechnology revolution doesn't experience any of the kinds of things we've seen in the past with, like, nuclear technology.

[38:44] We certainly don't want any Chernobyls or Three Mile Islands, or the kind of sensational stuff about cloning and things like that.

[38:52] That's my hope. And my goal is just that, you know, five or ten years from now, we've got all these beautiful tools that allow us to understand our brains and just live our best lives, because everything that we are is in our brains and in our data.

[39:04] And I'm just so appreciative of people who are, you know, out there leading the conversations, like yourself, and giving us this chance to get it right.

[39:13] Debbie Reynolds: Well, it's my pleasure. Definitely my pleasure. So how should people support you or get in contact with you and the Neurorights Foundation?

[39:22] Sean Pauzauskie: Yeah. So we do have a website which has an email,

[39:26] you know, contact.

[39:28] So if anybody's interested in getting involved: you know, as we mentioned, we do have a federal bill, but if we know anything about the federal process, it's that it's slow-moving, and it could be, you know, who knows how long.

[39:38] But, you know, right now we're not talking about 40 states with bills for neurorights. We're talking about maybe, you know, a dozen or 15. But we're still growing and very much interested in collaborating with people.

[39:50] So if you're in a state, even if you don't have a privacy law, and you want to get involved, please do reach out to us. We would love to connect with you and your legislator and just build this momentum as a coalition across the country. So the website would be the best way to reach out.

[40:07] Debbie Reynolds: Fantastic. Thank you so much.

[40:09] Well, I really appreciate you being here. This is fascinating. And I really support your work and applaud you for all the things that you're doing. So, yeah, hopefully we can find ways to collaborate and support each other in the future.

[40:21] Sean Pauzauskie: Yeah, definitely would love to. Thank you so much, Debbie. This has been a great conversation.

[40:25] Debbie Reynolds: Excellent. Well, we'll talk soon. Thank you.

[40:27] Sean Pauzauskie: All right. Yeah. Thank you.