"The Data Diva" Talks Privacy Podcast

The Data Diva E273 - Kohei Kurihara and Debbie Reynolds

Season 6 Episode 273


In Episode 273 of The Data Diva Talks Privacy Podcast, Debbie Reynolds, The Data Diva, talks with Kohei Kurihara, CEO and Founder of Privacy by Design Lab, about the relationship between privacy, trust, and innovation across Japan and the broader Asia-Pacific region. Kohei shares how his background in startups, blockchain, and digital identity led him to focus on privacy as a foundational element of sustainable technology.

The discussion explores the distinction between security and privacy, including why technical safeguards alone cannot establish trust. Debbie and Kohei examine privacy by design as a proactive discipline, contrasting it with reactive compliance-driven approaches. They discuss why companies that embed privacy early can move faster, innovate responsibly, and build stronger relationships with users rather than slowing progress.

The episode also examines cultural perspectives on privacy in Japan and Asia, including how collective values, family structures, and trust-based relationships influence attitudes toward data sharing. Kohei emphasizes that privacy expectations are shaped by history and culture, and that global frameworks must account for these differences. The conversation reinforces that trust, not compliance alone, is what ultimately determines whether technology is accepted and sustained.

Support the show

Become an insider: join Data Diva Confidential for data strategy and data privacy insights delivered to your inbox.


💡 Receive expert briefings, practical guidance, and exclusive resources designed for leaders shaping the future of data and AI.


👉 Join here:
http://bit.ly/3Jb8S5p

Debbie Reynolds Consulting, LLC



[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:25] Now I have a very special guest all the way from Japan, Kohei Kurihara. He is the CEO and founder of Privacy by Design Lab. Welcome.

[00:37] Kohei Kurihara: Yeah, good to be here. Thank you for inviting me, Debbie. That was really awesome.

[00:42] Debbie Reynolds: Oh yes,

[00:43] well, we've been connected for many years, I would say, and we have crossed paths many times, because sometimes I end up doing different knowledge nets and things in Asia or APAC countries, and you know, you're very wise.

[01:01] I know we had talked many years ago, I think before you even started your company. So I think it's really cool that you actually started this company.

[01:10] But I would love to have you tell everyone about your journey,

[01:16] how you got into privacy and how you became the CEO and founder of Privacy by Design Lab.

[01:22] Kohei Kurihara: Yeah, thank you for the introduction. Hi everyone,

[01:27] my name is Kohei Kurihara.

[01:29] I'm running Privacy by Design Lab, which is a non-governmental organization based in Tokyo, Japan.

[01:38] So let me explain a little bit about my background and history.

[01:42] I started my first company in 2014, which was a turning point in my career, because I was intrigued by my friends coming from overseas to give ideas, to give concepts, to create a new wave of endorsement for the startup scene.

[02:06] So we started a new business, a crowdfunding platform for content creators, especially for manga or animation, which was becoming popular in other countries. The Asian region in particular was a really promising market.

[02:24] So we put together a team to establish these new services for them.

[02:30] Unfortunately, startups always have a hard time; it's not easy to sustain these services.

[02:39] So we decided to shut it down, and at that time I started to work on my own company.

[02:48] So I was very lucky to encounter a lot of friends from overseas. They always told me that new technology, new innovation, is very important for the startup scene.

[03:04] I was lucky to receive a message from an American founder who works on blockchain and government technology.

[03:16] So I accepted his invitation to become a leader of community building here in Japan, especially for blockchain and government technology.

[03:29] So that was my second turning point, getting involved in these global themes.

[03:36] Since taking on these roles, I became very interested in blockchain technology itself.

[03:43] Then I met my former co-founders and we established a new company focusing on digital identity and blockchain. It was very interesting that blockchain technology can make data processing more secure and protected, in addition to the authentication processes, which may be one of the key issues in security models.

[04:10] So I was always exploring new things through these startups, and I spent almost 20 years there.

[04:19] Then I discovered a more important trend, which is privacy and data protection.

[04:24] Because the blockchain community, and engineers especially, are not talking about the data itself; they are always talking about the tech.

[04:34] But when considering social implementation, it is very important to integrate the data and the technology.

[04:44] So that's why I started my new journey. I visited UNESCO at that time, then came back to Japan and started a new company, which is Privacy by Design Lab.

[04:56] Through that, I'm trying to connect with global experts, to have discussions and dialogue together about what we can do for the future; even though the tech is growing, we should balance it with societal norms.

[05:14] So that's my commitment to starting the privacy business. I'm quite interested in the collaboration of innovation and privacy. So that's my brief history.

[05:25] Debbie Reynolds: That's an excellent history. I didn't know that you had that background in blockchain and digital identity. Actually, I've always thought that blockchain is an excellent technology to use for identity.

[05:43] I'm happy to see that you're in that area.

[05:46] The question I have,

[05:48] this is really interesting. So when you said that you had pivoted from identity and blockchain to privacy by design,

[05:55] in my mind I can see how you made that transition. But I think sometimes when people think about blockchain and identity and stuff like that,

[06:05] they do think that the tech solves for privacy if you have security, and it does not work that way. So I always say, and I want your thoughts: security is like having a key to a door, and privacy is like having a deed to a house.

[06:27] Right? So privacy is about a right of someone, and a key to a door would probably be helpful, but that doesn't cover the types of protection that you need. But I want your thoughts about how the security side is different from the privacy side for you.

[06:47] Kohei Kurihara: Yeah, I think it's quite an interesting topic.

[06:51] So this week I had an interesting conversation with my friends, with my colleagues.

[06:58] What's the difference between safety and trust? Right.

[07:03] In the case of safety, people tend to talk about system frameworks or standardization.

[07:13] So that's a kind of more systematic dialogue. But in terms of trust,

[07:18] maybe it's a different direction. That means, once you need to build trust,

[07:24] the system does not fulfill all the perceptions of what people think about trust. Right. In some cases, people tend to trust those with whom they have a good relationship.

[07:38] Right.

[07:39] In the case of human trust, which means you and me, we've been trusting each other. That's how we build a relationship.

[07:47] So it's not easy to describe what that means. But in the case of safety, it's easy to identify what the framework is, what the standard is. So we can be precise.

[07:59] So that's a very interesting topic. So maybe the identity is also very similar.

[08:06] In the case of the identity community involved in the security part, they always talk about standardization with a systematic approach,

[08:16] like disclosure systems and authentication. Of course it's important to prove that the system is safe. But trust has to be more diverse, more human-centric.

[08:31] Right.

[08:32] So those are quite different objectives. So we have to integrate these different contexts to prove what they mean to each other. Your idea of the house concept is quite interesting.

[08:48] And we should talk with different types of people about this kind of concept together.

[08:56] Debbie Reynolds: That's fascinating. Thank you for that.

[08:59] I want your thoughts about you mentioned trust. And trust is very important and I thought what you said was spot on, which is trust is about the perception that a person has about a company, how they handle their data.

[09:15] And so sometimes I feel like companies think they have done something extraordinary if all they've done is just follow the law. Right. And so the law is like the base, like the bottom,

[09:30] like the lowest level that you can be. Right. And so I think companies are starting to see that they have to do more than that to gain someone's trust. Because they expect just basically that you're going to follow the law.

[09:44] Right. They don't think that you're going to not follow the law.

[09:46] So just following the law is not enough to earn someone's trust. You do have to do these other extra things. And it's something that companies need to embed within the culture of their company.

[10:00] I love the fact that you named your company Privacy by Design Lab, because, and I want your thoughts here,

[10:13] I feel like sometimes companies also confuse security with privacy.

[10:17] Sometimes people think about security as like a reactive thing where privacy, I think if you do it well, is very proactive.

[10:26] Debbie Reynolds: So thinking about privacy at the design phase: the companies that are most successful think about privacy at the design phase. They don't think about it as an afterthought. But I want your thoughts there.

[10:39] Kohei Kurihara: Yeah, absolutely. Then,

[10:41] I have always been talking with Dr. Ann Cavoukian, the creator of privacy by design.

[10:47] She has always said that privacy by design is quite an interesting context that she created, because she always said that privacy is freedom.

[10:57] Freedom is a choice, not only for the users but for the company as well.

[11:02] So that's quite significant to consider of the proactive approach.

[11:08] It's not a compromise of security purposes, because proactive means that companies, enterprises, or any kind of entity wants to start positively, right, while reactive means negative actions.

[11:27] So in those cases, the company is trying to comply after the law has come. Right.

[11:33] Once a privacy law has started, we have to comply; that's a reactive approach. But the proactive approach is creating the market, creating innovations. It's very positive.

[11:46] So we have to change the mindset. It's not to be reactive.

[11:50] We should create new actions before something comes. That's very important for creating innovations.

[11:57] So it's very important to have a different mindset from the reactive one.

[12:02] So if companies advocate this kind of action, we don't have to worry so much about privacy rights.

[12:12] If every company has a proactive mindset,

[12:16] then nobody needs to worry about it. So that's very significant.

[12:20] And as I mentioned, trust is very key, because trust is a relationship with the market, with the consumers, with the users, with the stakeholders.

[12:31] So that's particularly important.

[12:33] Every stakeholder has to work proactively at this moment. So privacy by design

[12:42] is the approach that involves the different stakeholders working proactively, not reactively. So that's a very important message I'm always receiving from Dr. Ann Cavoukian.

[12:54] So our organization is working with different stakeholders, getting them involved and creating dialogue. That's part of the story of creating a new movement together.

[13:06] Debbie Reynolds: That's tremendous. I agree with that. And yeah, she has been on our show as well, so I'm very much a fan of hers.

[13:15] So one thing that I hear a lot and I want your thoughts is that if we focus on privacy, somehow that's going to slow down innovation.

[13:28] And I feel like companies that take it seriously,

[13:32] it actually speeds up innovation. But I want your thoughts from your point of view.

[13:38] Kohei Kurihara: Yeah, I think innovation is not always about harming people. Right. Innovation is about usefulness for users and society.

[13:53] Otherwise innovation does not happen.

[13:56] So going back to some of the innovations in our past society, we have had a lot of things, just like automobiles, lights,

[14:06] airplanes; a lot of innovation has come.

[14:09] Of course, this innovation improves step by step whenever any kind of harm happens. That's very important.

[14:20] When we start a new innovation, just like AI today, it's a very big boom, but it's starting to shift from just a portion of hype-driven growth toward societal integration.

[14:35] So that's very significant.

[14:38] So we always have to embrace the mindset that innovation must balance high growth with better societal integration. Otherwise innovation doesn't happen.

[14:51] So it's very important to take an acceptable approach.

[14:54] So trust is the key to creating dialogue with society, users, and different stakeholders, so that innovation is received as a societal action.

[15:05] I think we're at a turning point right now. The expectation is that this new innovative technology makes a big contribution to society. Of course, I was in the blockchain space.

[15:19] Blockchain was a boom in 2018 to 2020.

[15:24] But what happened is that, for example, the tech was not perfect.

[15:30] Some people lost money.

[15:33] Actually, it didn't work well in society. So it's very important to balance trust and innovation.

[15:44] Debbie Reynolds: That's true. And when people say that privacy gets in the way or slows down innovation,

[15:52] I always say: who are you innovating for? You're innovating for people.

[15:56] And so if you're doing some innovation that harms people or harms their trust in what you're doing, then who are you innovating for?

[16:05] Like what is the purpose? So that human centric idea is really key, I believe.

[16:12] Tell me a little bit about Japan,

[16:16] tell me a bit about privacy in Asia Pacific.

[16:20] A lot of the talk, as you know, since we're on LinkedIn and in the different discussions that we have, is about what's happening in the US and what's happening in Europe.

[16:33] And we don't get enough viewpoints from other jurisdictions. But give me your feel, for people who may not know or understand: what is the culture in Japan?

[16:47] What is the ethos or the feeling around the importance of privacy, and how does that play into the way that you think about it?

[16:59] Kohei Kurihara: Yeah,

[17:00] that's very important questions.

[17:02] Through my experience talking with my friends in different countries, western friends and then Japanese or Asian folks, I feel the context of the establishment of privacy is quite different.

[17:19] Because Europe, for example, has had a very long history of human rights, fundamental rights, maybe going back generations. But here in Japan, the first privacy dialogue started after World War II, with a very famous case.

[17:38] We talk about "After the Banquet," which is the famous case.

[17:42] Maybe it's very similar to a kind of yellow journalism issue here.

[17:47] So we had that story.

[17:49] It's not a long history. It's very new, because it borrowed a bit of its context from western countries.

[17:58] But beyond privacy, you have to go back to Japanese culture, to how Japanese society is structured. It's a bit different.

[18:07] Of course, it's very similar to other eastern Asian countries,

[18:11] like Korea and China: we tend to cherish our families and pedigrees,

[18:20] how close relationships work. If we build a good relationship, good trust, with folks and friends, we share as much as we can. Right.

[18:33] So there are no borders for privacy there. That's very interesting.

[18:38] So what I want to say is that the culture of individualism and this kind of family-based concept are different contexts, even when we talk about privacy.

[18:51] Of course, the Internet makes us more connected and makes it possible for this concept to become seamless. But we still have the indigenous context in each culture. So it's very important to know what the difference is.

[19:11] What's the cultural difference?

[19:14] So how does it impact the context of privacy?

[19:19] And of course, it's also the same in different Asian countries. Even within eastern Asia, China, South Korea, and Japan have different histories and establishments, and even here in Japan, the north and the south are different.

[19:40] So that's quite an interesting conversation. What's the difference? We have to compare first, not just talking about the law;

[19:49] we also have to talk about the cultures.

[19:52] That directly impacts the privacy norms in each state and how people claim privacy.

[20:01] Actually, Japanese people do not talk about privacy, because privacy has been very conservative; it's been a more negative word. If you insist on privacy with your family, they say:

[20:14] we are family, why do you need privacy?

[20:17] Right.

[20:18] And across generations, like the young generation, people in their twenties or teens, there are also different contexts of privacy. So that's very complex.

[20:29] We have to think about the differences, not only by country rules;

[20:35] the generation gaps also remain.

[20:38] So we have a very complex matrix of how that works.

[20:43] So that's my favorite dialogue with the different types of people here.

[20:49] Debbie Reynolds: That's fascinating, and what you said is very important. I think a lot of people don't think about it. So what you said about the idea of privacy or societal things: especially in western culture, it is very much about individualism,

[21:08] Not really around what's good for the community quote unquote or the society.

[21:17] And I actually noticed that in my travels to different places. How it's like very different. Like people will do things not because they know you. Because they just think it's the best thing for the community that they would do this thing.

[21:31] And so we don't really get that there. But I think that those things do show up in laws and in culture. And I think it's very important that people don't think about privacy the same.

[21:46] So I can't come to Japan and have my own US Western concept and try to force it on you. And that's why I think it's really important to understand those cultural differences.

[21:58] Yeah,

[21:59] very cool.

[22:00] What's happening in the world today in privacy or technology that's concerning you most?

[22:08] Kohei Kurihara: Since this year, I've become interested in new norms of privacy, because international relationships have been changing since last year. I guess here in Asian countries, the relationships with each other are also becoming complicated.

[22:34] Actually, there are tensions between nations at the nation-to-nation level.

[22:41] So that's a kind of new trend.

[22:44] I mean, national security is one of the typical issues, typical matters, in each region: how these countries can cooperate together to share the same values on privacy.

[22:58] So that's one thing I'm very interested in. In contrast, when I speak of national security, privacy is one of the very important topics.

[23:09] For example, in some countries there is a war right now, with the use of high tech, like drones or military defense, and these kinds of physical actions through automobiles, vehicles, IoT devices, and robotics.

[23:27] So maybe it's a next generation of innovation.

[23:31] So these things have to integrate the privacy context.

[23:36] Otherwise, our lives will always be intruded upon by this specific innovation. That's not permissible.

[23:46] So we should take action. If this kind of new physical technology comes into our lives, how can we make sure privacy remains?

[23:58] So that's a very important topics.

[24:01] So I'm trying to work in this context: how we can protect privacy in tech and innovation used for national security purposes. So that's a new trend.

[24:16] And it's also very important to cooperate with different countries on how we can create better innovations to protect life.

[24:27] So we should work together on this from the start.

[24:32] My curiosity is how it works.

[24:35] Debbie Reynolds: That's fascinating.

[24:37] I had a conversation with someone from Rwanda, Africa recently and we were talking about privacy data protection in Africa. And one of the things that he brought up. And I want your thoughts here, you're the perfect person to ask this question.

[24:49] And we were talking about the idea that, just like the EU has adequacy,

[24:56] in order to transfer data you have to conform to the norms of what the EU wants in order to get adequacy. And then he was talking about moving away from adequacy toward thinking about data sharing and frameworks, and in his thought, it was very funny when he said,

[25:18] adequacy is like a velvet rope, like a secret club that you have to get into, whereas frameworks are more open. And I want your thoughts about the Asia-Pacific privacy framework.

[25:31] It's like we are not the same and our laws are different,

[25:36] but when we exchange data,

[25:39] there are certain norms that we want to align with in order to do business with each other. And to me, I think that's probably the perfect way to do a lot of these data transfers because we're not every country.

[25:56] Just like you said, we have a different history,

[25:59] we have different culture,

[26:02] we have different things that we think are important within our societies. So there's no way that we're going to be the same, right, in terms of privacy. So if we think about privacy and data sharing in a framework capacity, we're saying: okay, this is what we want to do, and this is how we can align so that we're not abusing the rights of people in other places.

[26:27] But I want your thoughts.

[26:29] Kohei Kurihara: Actually, you've pointed out significant issues right now.

[26:35] We had a talk with some people who are involved in data transfer mechanisms, like CBPR, which is trying to become global.

[26:45] But the issue remains how to coordinate the approach across different countries,

[26:52] whether they include enforcement, or whether it's maybe just a kind of new trend.

[26:59] Of course, we have some challenges at different levels of resources and tactics, and in how we can overcome them. That's also an issue, and also the political biases.

[27:11] Even Asian countries have different types of election processes,

[27:19] and different ways politicians build relationships between countries.

[27:23] So these are very important.

[27:26] Not just talking about privacy, but also about the law-making processes as well.

[27:30] Right.

[27:31] Even the EU has just been discussing how to protect: whether to keep the GDPR as it is or change it toward more simplification.

[27:41] I don't know how that direction may directly impact adequacy as well.

[27:46] So this is quite important in the privacy context. It's always changing, not just remaining one thing.

[27:56] For example, going back to history: before the Internet, privacy was not mainstream. Privacy was a very minority context.

[28:10] But now everybody is using the Internet, so privacy has become ubiquitous.

[28:17] So this kind of tech context directly impacts the data transfer discussion as well.

[28:25] So businesses become more complex.

[28:29] Before the Internet, we didn't care about where a product came from.

[28:34] But now, any kind of malware can be integrated into IoT devices, smartphones, or any kind of device that you use in your daily life. Then they can easily track your actions.

[28:52] Not just your Internet browsing: you're turning on your TV, or your refrigerator, or taking a bath; everything is easy to watch.

[29:06] And also the location data might be collected through these physical devices as well.

[29:12] So that's another topic about transferring your information to a third country. It's important to consider the broader context: how these services are created, and who's involved as a stakeholder. It's a supply chain issue.

[29:31] I think it will be tremendous, in the next decades, how these devices can squeeze out more information by integrating AI.

[29:42] It's unavoidable that every consumer will be facing these issues.

[29:49] In those cases, what happens? We have to imagine it.

[29:53] We have to backcast from that to how it works right now, and how we should act right now.

[29:59] Of course the data transfer issue is one of the compliance actions.

[30:04] But this is also an issue of the supply chain. That's why this point about national security will be one of the linkages to privacy in a few years,

[30:15] in accordance with the surge of topics about the conflicts between different countries right now.

[30:23] Debbie Reynolds: I agree with that.

[30:24] I think as you were talking I was thinking more about the difference between typical law and regulation and privacy by design.

[30:36] And I'm reminded that when the GDPR came out and they had put the principles of privacy by design in it, like a lot of legal folks I knew were extremely upset because privacy by design is the opposite of the way that they think about regulation.

[30:56] So they think regulation says this bad thing happened to Johnny and so let's pass this law, don't do this because this bad thing will happen to you or something like that.

[31:06] Right. Where privacy by design is saying if you are proactive in the way that you think about harm to people and respect their rights,

[31:17] then you can prevent or lower your risk downstream. And that's so different from looking backwards, because that's kind of the way law is, in my view. Where you're like okay,

[31:32] you know. And another thing that people didn't appreciate about the GDPR is, for example,

[31:43] purpose limitation.

[31:45] What they really wanted was someone to say delete this data after three years. And they're like, well that is too prescriptive.

[31:53] Where we're saying think about your purpose. And if you think about your purpose, that means you have to have a data strategy.

[32:01] And unfortunately a lot of companies didn't have that. They were just doing whatever they wanted to do and they never thought about a strategic view of privacy. And so when you're talking about privacy by design and figuring out how that weaves in, obviously there's a place for regulation, but regulation does not stop the harm.

[32:26] I don't think of regulation as a shield.

[32:28] Like, you can't hold up a sheet of paper and have someone stop abusing your privacy just because you have a sheet of paper. Right. That's written as a law.

[32:37] So thinking about it in terms of design,

[32:41] I think it's the right approach. But what are your thoughts?

[32:45] Kohei Kurihara: Yeah, I think regulation is very important to provide guardrails for innovation. Right.

[32:56] We cannot say that regulation does not work. Regulation is pretty important to protect our consensus, actually.

[33:08] But when it comes to imagining society,

[33:12] regulation is very difficult for people. Right. It's very hard to understand

[33:19] what regulations mean.

[33:22] We are always talking about regulation among regulation experts. But I often talk with non-experts. Right. You can imagine your family:

[33:37] oh, they don't know what the legislation means. It's very hard for them.

[33:41] But they also have privacy.

[33:43] So we should include all these kinds of folks as stakeholders of the society. Otherwise, the regulation is just a consensus among the experts. Right.

[34:00] So what I want to say through privacy by design is not that we should not talk about regulation; we should talk about regulations.

[34:11] But we should also have other options for those who are not able to be involved in the regulation. That's very significant.

[34:20] Otherwise, some people are missing a piece; in some cases, the children.

[34:26] Maybe they haven't seen what the regulation is, what the law is.

[34:31] But they come to understand step by step, through using the devices, using the services,

[34:38] and encountering harms and other things.

[34:44] So it's a kind of learning.

[34:46] But it's very important: they are among the users.

[34:50] So companies have very important responsibilities: what kind of harm could be acceptable at a societal level? That's a design process.

[35:01] What is the threshold of what can be permissible?

[35:05] Of course, with AI, some people might feel AI is very scary and not want to use it. But other people say: oh, AI is very useful, we should use it.

[35:15] So the thing we should focus on is not just talking about regulation, but talking about societal norms.

[35:22] So the by-design approach is not only about designing regulations; of course, it includes the processes of regulation.

[35:30] But besides that, we should talk about the societal norms, what could be acceptable, and how these norms are changing according to current societal actions.

[35:41] So we are working together with different types of stakeholders, listening to the voices of students and what they say; they are also stakeholders.

[35:55] So it's very important to take social action, not just talk about the regulation.

[36:01] So I think that's the message in advocating the by-design approach.

[36:08] So we are still in the progression of how this innovation can take an inclusive approach and include these voices from the minority.

[36:18] So we shifted toward raising pilots and approaches as well as regulation. So yeah, that's why I'm now working on this organization.

[36:26] Debbie Reynolds: Right,

[36:27] I agree with that. That's fascinating the way that you put that.

[36:31] One of the things.

[36:33] Well, two things I want your thoughts on.

[36:36] One is about artificial intelligence in general.

[36:39] Like all these laws that jurisdictions are passing about artificial intelligence.

[36:45] And then also the fact that artificial intelligence is enabling data creation and data collection that we've never imagined.

[36:55] So that's the reason why a lot of laws,

[36:59] if they're too prescriptive, don't cover some of these other use cases. And I feel like we're dealing with technology that can move very rapidly and is being used for things that could harm people.

[37:12] My concern is that there'll be a situation where there'll be no adequate law or regulation that can stop someone from being harmed.

[37:20] But then in general,

[37:22] and I want your thoughts about this,

[37:24] I would rather regulate the data than the technology.

[37:32] So let's say AI is all the rage right now, and let's say blockchain was all the rage a few years ago.

[37:40] If you're regulating the tech, you're always going to be behind, because there's always going to be something new around the corner that's going to be the hot new thing.

[37:49] So if you think about it, in my view, about the data and about the people,

[37:55] then you can have something that can withstand these changes in technology.

[38:01] What do you think?

[38:02] Kohei Kurihara: Yeah, that's an important topic as well.

[38:06] As I mentioned at the start of this interview, trust is significant — it's the key driver of these dialogues.

[38:16] Maybe just because there are a lot of challenges in the AI space.

[38:23] A lot of models are surging now, a lot of countries are investing hugely, and we are always discussing how AI is going to be used — AI and the stock market.

[38:39] It happens in a lot of contexts.

[38:41] But what's finally important is how AI can contribute to society. That's the bottom line. Otherwise, people are going to give up on using it.

[38:55] So right now it's highly expected that AI will change the world.

[39:01] But AI is

[39:03] just another innovation. At this stage, it's more about societal integration.

[39:09] So people should think about it.

[39:11] I saw the same context in the blockchain space.

[39:15] Just a surge in prices, and a lot of coins out there.

[39:20] But who is actually using blockchain? Right. And that was very overhyped.

[39:27] So we should actually think about whether AI can create a better society for us.

[39:35] For example, some articles are showing that AI is causing job losses. A lot of people are no longer needed in these roles; AI will replace them.

[39:48] Some companies have been cutting people's roles to replace them with AI.

[39:54] Of course, AI will contribute to business processes.

[40:00] But how do these people feel about it? Right.

[40:04] So it's quite important — not just the innovation side,

[40:09] but also how we can create a better society through AI.

[40:15] So that's the important context. We should not be talking about the stock price;

[40:20] we should be talking about how AI can make us smile, these kinds of things.

[40:26] It's a very emotional discussion.

[40:30] But otherwise, people will always push back, demanding stronger regulations, stronger guardrails.

[40:39] That's not good for all the stakeholders. Right.

[40:42] They don't want to be degraded, and in the end, they would only partially receive the benefits for society.

[40:49] So from my point of view, I guess AI companies and regulators have to pay attention to who can benefit and what society can gain through AI innovation.

[41:03] Otherwise, we'll see a history similar to what I experienced in the blockchain space.

[41:11] Debbie Reynolds: Yeah, I agree with that completely.

[41:14] So if it were the world according to you,

Kohei, and we did everything you said, what would be your wish for privacy anywhere in the world?

[41:24] Whether that be technology,

[41:26] human behavior or regulation?

[41:30] Kohei Kurihara: Yeah, I think it's a turning point. As I mentioned in my first introduction, there's a lot of discussion. Some people just say AI is hype, innovation is hype.

[41:43] But I don't believe it comes down to hype, because the hype is shaped by feedback from society.

[41:52] As I mentioned, in cases where the new innovation is harming people, then people turn against the technology. It's a very simple structure.

[42:00] So what we should think about is how this innovative technology can contribute to society, and how the different stakeholders involved can contribute to this work.

[42:14] So I have always been proposing a multi-stakeholder approach since establishing this organization, so that as many stakeholders as possible can be involved in the dialogue.

[42:25] I think it's very important, not only for us, but also for these kinds of grassroots approaches to involve different countries,

[42:35] different regions working on that.

[42:38] So maybe it's good timing to reconsider whether the innovation can change us for the better or might be disruptive to ourselves.

[42:51] So yeah, it's very crucial to share values together through these kinds of interviews and opportunities, across different generations, different genders, different nationalities. So yeah, it's good timing to discuss it together.

[43:09] Debbie Reynolds: Yeah,

this is an interesting time to have those discussions about, as you said, community,

[43:16] society, culture,

[43:18] the way that we can work together,

[43:20] collaborate,

[43:21] share, not harm people. That's always my concern.

[43:25] So I agree with that and I share your wish for sure.

Well, thank you so much for being on the show. This is a pleasure. Thank you for joining from Japan. I know it's early in the morning, but yeah, it's so great to connect with you, and I'm happy to have you as part of the privacy community globally.

[43:45] Kohei Kurihara: Yeah. Debbie, thank you for running this very interesting channel.

[43:48] I'm always happy to contribute in that sense and share value with the folks in this industry. Thank you.

[43:57] Debbie Reynolds: You're welcome, you're welcome. We'll find other ways we can collaborate together in the future.

[44:02] Kohei Kurihara: Yeah, absolutely. Bye.