Debbie Reynolds “The Data Diva” talks to Takaya Terakawa, CEO, Technica-Zen, K.K., Japan, and Author. We discuss the Asian concept of privacy, the advantage of other professional experiences prior to a career in privacy, the subject of his forthcoming book, the history of the PIPL in China and privacy in the Asia Pacific region, differences between human and consumer rights in Asia, punishment for cybercrimes resulting in criminal penalties, the harm caused by data misuse, his privacy concerns with emerging technology, the potential impact of a lack of privacy in the future, the risks vs. rewards of technology, the regulatory focus on harm versus specific technologies, potentially more stringent privacy laws, and his hope for Data Privacy in the future.
privacy, people, data, japan, laws, localization, criminal penalty, china, world, technology, happening, harm, business, country, regulations, data protection, europe, government, company, philippines
Debbie Reynolds, Takaya Terakawa
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a special guest on the show today, all the way from Japan. He is Takaya Terakawa, the CEO of Technica-Zen, K.K. Welcome.
Takaya Terakawa 00:40
Thank you very much. I'm very honored. I'm very excited to talk with you, Debbie.
Debbie Reynolds 00:46
Yeah, this is great. So we got to know each other; we met on a panel that we did together for the IAPP in the Philippines. And I loved your point of view, your worldview, and your perspective, and I wanted to make sure that you got on the show, because I feel like a lot of times when we're talking about privacy regulation, it's very European-centric. And it doesn't really highlight, in my view, the very strong data protection laws and regulations that are in Asia. Because we talk so much about Europe, some people don't even know that, for example, the Philippines has had data protection regulations for decades, as have Japan and South Korea. So I would love to get your perspective on what's happening in that part of the world. But before we get into that, I would love to have you tell me how you got interested in privacy. I think it would be fascinating for people to know.
Takaya Terakawa 02:02
Of course. So yeah, first of all, the panel we met at was really fun. It lasted around three hours, and you remember I was talking with Raymond, the previous Commissioner. We were talking about how we can promote the APAC privacy environment more, because, as you just mentioned, people always talk about the Western concept of privacy, but actually Asia, APAC, has a very deep privacy concept and deep privacy traditions. So we were very happy to promote that. And thank you very much, again, for inviting me to this. So, speaking of how I entered the privacy field: it was in 2017, just before the GDPR. I was originally a risk management consultant, and at that time I had just started my own company. I was thinking about what kind of business would be successful for my company, so I tried everything at the time — machinery safety and other ideas — and then I encountered the GDPR. Japan is a very closed country, and companies here are not good at overseas regulations or laws. So when I saw the GDPR, I thought this would be a pain point for Japanese companies. That's how I came into privacy. I studied the GDPR through IAPP training, I leveraged my experience from risk management, and I also learned the CIPM privacy management program. And I started doing consultancy for companies in Japan.
Debbie Reynolds 04:25
I think it's fascinating to see people like you who come into privacy from different areas — so not necessarily law — because I'm a technologist, too. I think that people of all types in all industries can benefit from having some type of path into privacy. And you turned it into a successful business. That's great. You mentioned you're finishing up a book. I'd love for you to touch base and let us know what this book is about.
Takaya Terakawa 05:07
Oh, thank you for mentioning it. Yeah, it's on the PIPL, China's Personal Information Protection Law. I went through all the data-related laws recently established in China, translated them into Japanese, and added annotations. It's a privacy law, but it cross-references the cybersecurity laws and is closely related to China's data strategy. China has been working for 20 years to establish that strategy as a country, and there is a very clear focus. So I wanted to highlight that context so that people can understand the Chinese privacy law more deeply. That's what I deal with in the book. And it's going to be a very good book, I believe.
Debbie Reynolds 06:18
I feel like people don't really understand the landscape of data protection in Asia. A lot of times, when I talk to people — first of all, before the PIPL, they didn't talk about China at all. They rarely talked about Japan, and only occasionally would you hear something about South Korea. And the Philippines has done a really great job, especially with Raymond, right? They have put out a lot of information to educate the public about data protection. But for people who don't understand what's going on and what's happening in Asia now, in addition to the PIPL, maybe you can help people understand data protection in that region.
Takaya Terakawa 07:12
That's a very good question. Actually, when you say APAC, it's not one region. We have separate cultures and separate countries, and we have different concepts of the world. There are so many countries over there — China, Korea, Japan — and everybody's different. We are doing many things differently. So, yeah, it's quite understandable that you lose sight of what's going on in the Asia Pacific. But the fact is, we are trying to modernize our privacy laws a lot, because the digital economy is progressing. In Asia especially, most Asian countries started using the Internet on mobile phones, which is quite different from the context of the United States or Europe. Many people started with a mobile phone, and more and more information is going into cyberspace through small devices. That's why we are working more aggressively on how to manage data and protect people in Asia. And since this concept is quite new, we usually look at the GDPR or the OECD guidelines or other principles and, using those as a kind of template, set about creating our own privacy laws. So the result looks similar to the GDPR or some other laws, but there are very different points in each of the laws in Asia. That's what's happening here.
Debbie Reynolds 09:54
Okay, great. Let's see; I wanted to get a feel for the foundations or some fundamentals of data protection or Data Privacy laws. So, for example, in Europe, privacy is a fundamental human right, right? Whereas in the US, it's a consumer right. And I feel like the PIPL is more of a consumer right as well. And then in the Philippines, I feel like it may be more of a human right. To me, those are very foundational differences. What are your thoughts about that?
Takaya Terakawa 10:38
That's a good point, and I agree with that, but in the case of Japan or Korea — yes, we do talk about the fundamental rights of people, but often the laws are driven by business. When you look at, say, Singapore's privacy law, it's quite business-friendly. Many of the Asian laws, in my perspective, are business-driven — they want to create an environment that makes business easier for the companies in their country.
Debbie Reynolds 11:37
And what are your thoughts about something I'm seeing in different countries — a wave that I've known about for a long time? In China — you know a lot about that, since you wrote a book about it — but what do you think about countries putting things in their laws related to data localization? What is your thought about the reason why that is?
Takaya Terakawa 12:06
Okay. So, data localization is, in my opinion, a natural consequence of the data economy, because sometimes data is critical for defending the nation. For example, geolocation information is critical when it comes to a military operation. You need to protect important infrastructure when something abnormal happens, so countries need to protect that information. Before, data was secured inside the nation, and it was easier for countries to protect information, but nowadays it's much easier to move data outside of the country, and confidential information easily goes out. So I can understand the feeling — nations want to secure the data inside. And in fact, I see that China has data localization, Vietnam has data localization, and India may well have data localization in the future, I believe, so this kind of tendency is spreading all over the world. And Europe also, I think, has a tendency toward data localization — by invalidating the Privacy Shield, it is now essentially very difficult to send personal information from Europe to the United States. So it's kind of a world trend.
Debbie Reynolds 14:04
Yeah, I think so. It's really interesting. Every country takes a different approach to it. I just happen to know this about India, but India has had very strict localization laws for decades around banking — there's certain banking information you can't get out of India. And then, as technology has grown over the years, they're starting to create more regulation around that. Certain financial transactions — all parts of the transaction — have to happen in India, so it can't be servers in other countries transferring data back, and I know some business people have been really upset about that. But in some ways, I feel like part of it is also an economic driver, right? Some countries, I think, want to keep the jobs in the country, or they want to make sure that they're working with people who understand the local customs and what's expected of them. What are your thoughts about that?
Takaya Terakawa 15:19
Well, I think data can be accessed from everywhere. Wherever you store the data, you can access it. So I think putting the data locally doesn't make a big difference for running a business, in my opinion. But then, speaking of data localization, in Japan we had a very interesting case last year. A communication company called LINE outsourced data analysis to a Chinese company, and the PPC, Japan's supervisory authority, warned about it. The PPC gave an instruction to the communication company because they had outsourced personal information processing to the Chinese company without good security measures. So this is happening. I think this is more of a governmental action — businesses want to spread data around, because sometimes it's more cost-efficient and more beneficial for the companies, but it's the government, in my opinion, that tries to restrict the storage of data outside of the country.
Debbie Reynolds 17:18
In some ways, when people think about doing data localization, it seems to me to be related to people thinking: if I keep it local, it's safer in some way? And I'm not sure; I'm not —
Takaya Terakawa 17:37
Sure — you know, data can move everywhere.
Debbie Reynolds 17:40
Yeah. Localization laws — I don't think data becomes less or more safe as a result of that. But maybe it harkens back to the way data was in the past — say you had paper or physical objects that you didn't move; you put them in a locked room or something. So maybe localization is like a digital locked room, in a way.
Takaya Terakawa 18:07
Yeah, at least for the government — they can access the data center directly and then do a detailed investigation on site. So maybe that's the benefit the government may be feeling. Yeah.
Debbie Reynolds 18:26
So one thing that I'm noticing in places like Australia, China, and India is that in some of their laws around data protection, if businesses run afoul of these laws, there are criminal penalties. That's something we're not yet seeing in the US or Europe. There are a lot of cases being filed and things going to court and such, but what are your thoughts about the genesis of those penalties — where, if you're a data protection officer and you do something wrong, you may end up with a criminal penalty or something?
Takaya Terakawa 19:10
Actually, Japan also has criminal provisions in its law — for wrongdoing, a fine or sometimes imprisonment. It's quite an easy way to regulate people. Actually, Asian countries commonly have criminal provisions in their privacy laws. It's very interesting to find that out — I don't know, maybe that's just where we are.
Debbie Reynolds 19:58
In other countries? Yeah, maybe. Because I saw a case — this was in China, actually, many years ago — where a woman wanted to go to college, and she had saved her money. You remember this case, right? Yeah. She had saved her money to pay for college, and she was going to transfer the money. And she got an email telling her to send it to a different account. It was actually a cybercriminal who wiped out her account. And she was so grief-stricken about the whole thing that she passed away — I think like three days later, she died. And the cybercriminals who did this went to jail for murder, right?
Takaya Terakawa 20:46
Yes. Yeah. That was quite a big action — big news. And it was impressive to see, but at that time, China didn't have a lot of data protection, and their cybersecurity laws were also weak. So the timing put China in the mood to regulate after that incident, because it was a big, big issue in cyberspace in China, and many people were focused on the news. The government had to work on it, and the easiest way for them, in my opinion, was using the police to regulate it.
Debbie Reynolds 21:43
Yeah. Well, that to me was interesting, because when you mishandle people's data, you can harm them, right? And different countries have different notions of what harm is. If they had physically done something to her, I think all countries have laws around that, but when the harm happens in this way, it's very different, right? Here they said: this is the direct result of this crime, this person is now deceased, and we think that you are responsible, so there is a criminal penalty. I thought this was interesting because I think there is grievous harm that can happen to people when their data is misused.
Takaya Terakawa 22:34
Yes, exactly. Yeah. Data misuse usually creates privacy harm, and we need to remember that, because we often talk about the importance of privacy, or that personal information must be protected, but we often forget why it is important, or why we should protect it. I often talk to my clients about the consequences of a data breach or of privacy harms, and I always ask them: if this happens, are you happy? If this happens to your family, or your brothers, or your wife, or your children, can you take it or not? And that gets them thinking — yeah, it's really important to think over privacy. I totally agree with you.
Debbie Reynolds 23:34
Yeah, that's fascinating. Tell me what's happening in technology now or going forward that concerns you most as it relates to privacy.
Takaya Terakawa 23:47
In Japan, facial recognition is popular, and the private sector is actively using it, and the government too. As you may know, Japan is a little behind in digitalization, and the Japanese government wants to promote the digitalization of the economy a lot. My biggest concern with Japan's situation right now is that the Japanese government recently announced that they're going to collect all the educational data of the people in Japan and track it for their entire lives, so that they can provide the best learning opportunities to the people in Japan. But I think — that's none of your business! First of all, can we really trust the government in the way they handle our data? Right? Yeah, the decisions of the government these days are really, really scary.
Debbie Reynolds 25:09
Yeah. Wow, oh my goodness, I've never heard of that. That is scary. Especially when you're younger — Lord knows what you'll do back then — you don't want someone holding something against you that you did at a younger age. And then, too, I feel like when you collect so much data, you may see patterns and things that aren't true or aren't right, and people may make an inference — say, oh, well, because this person took this class, we think they're going to be a criminal, so we're not going to send them to a good school, or they can't get this job, because that's what the data told us. So I'm concerned about that as well.
Takaya Terakawa 25:53
Yeah, that's the biggest issue with the digital world and profiling: we provide so much data to somebody, and somebody creates a profile, and another person will use it to decide something about us, and we never know what's going on over there. Somehow our future is decided by somebody we don't know. That's a really, really scary thing.
Debbie Reynolds 26:36
Well, it is. I don't know — I like technology, but I don't like everything people try with technology. But I also think people have a misunderstanding: they think that technology is perfect. We aren't perfect as humans, so what makes you think that technology is perfect, that it's going to come up with a perfect result? The blind spot we have is that we think, okay, computers are smarter than me just because they can do things faster than I can. But they can't make judgments, right?
Takaya Terakawa 27:13
And technology is always only a tool, so we need to use it; we should not be used by technology.
Debbie Reynolds 27:24
Right? Yeah. The analogy I see of us doing something that's not good: there was a case in the US where someone was driving a Tesla, and they decided to get out of the driver's seat and get into the passenger seat. And then the car ran into a tree and killed the people who were in the car. I feel like, unfortunately, some people abdicate their human responsibility and judgment to technology. To me — man, getting out of the car's driver's seat? That is ridiculous.
Takaya Terakawa 28:09
Yeah, yeah. But, you know, I don't think stopping the use of technology is wise. We need to use technology because it's really beneficial to our society, and it advances our world. But at the same time, we need to be very careful about the consequences of using it. It's always a kind of conversation between technology, human society, and where we want to go. So the point is not to criticize technology, but to ask how we can create a better world using new technologies. That's the point, in my understanding.
Debbie Reynolds 29:10
Right, exactly. So I give the example: you can use a brick to build a house, or you can use a brick to hit someone in the head and hurt them. But you don't outlaw bricks; you just stop throwing bricks at people. To me, technologies are that way, right? What are your thoughts?
Takaya Terakawa 29:31
Yeah, I remember that when cars first appeared, doctors said that if people traveled at more than 50 kilometers per hour, they would die. Well, we're still alive. It's a kind of fear of something new. We always have fears of new things that appear in the world, and we sometimes overreact. Cars are certainly useful — thanks to them, we can expand our movement, we can go to many places, and they expanded our economy. So that's my point: we need to use technology, but we need to use it wisely, and we need to make the discussion future-proof. Privacy professionals are doing a lot of privacy risk assessments and having a lot of discussions around the world, and what we are doing, I think, is trying to figure out what is good for society.
Debbie Reynolds 31:00
Yeah, and I'm concerned about regulations being passed that are too much about technology — a lot of regulations are too specific to a particular technology. There are all these cookie lawsuits happening in Europe, and by the time all this is done, cookies won't even exist. Cookies were, in my view, a mode of transportation — the things that people did with cookies can be done in other ways. So I feel like we should be talking more about harm: if technology companies do certain things, it could possibly harm human rights. If you flip it around and don't talk about the technology so specifically, but talk about the harm that can happen, I think you get better laws that are more future-proof, right? Laws that don't age out, that people can't play tricks around. You know, after cookies, maybe we'll have biscuits or something, or crackers? I don't know.
Takaya Terakawa 32:18
So, do you think, you know, criminal prosecution will work for European countries or the United States?
Debbie Reynolds 32:27
Oh, my goodness. I don't know. That's a good question. In the United States, for some reason, people seem to be able to get away with lots of things. So I'd be concerned that it would criminalize things for people who didn't have the money to fight their way out of trouble — that digital divide concerns me a lot. But something that has happened in the US, and that I think will happen more, is that we have a law in Illinois called the Biometric Information Privacy Act, and it's currently the most stringent biometric law. Under that law, Facebook settled a case in 2020 for $650 million in that state, and as a result, we're starting to see other states thinking, well, I want $650 million too. And Facebook decided they're not going to do facial tagging anymore, probably because of this. To me, that was a good thing, because if that type of penalty can deter companies from using data in a way that people aren't okay with, then yeah. And criminal penalties, if that's necessary? Yes, I think so — because, again, just because you didn't physically touch someone doesn't mean that you didn't harm them. So I would love to see more definitions of harm to individuals.
Takaya Terakawa 34:18
Yeah, exactly. It's really important to figure out what privacy harms exist in the world. We do have some concepts, so maybe we can clarify them more. Yeah, I totally agree with you.
Debbie Reynolds 34:39
If it were the world according to Takaya, and we did everything that you said, what would be your wish for privacy or data protection anywhere in the world, whether it be regulation or technology or something about humans? What are your thoughts?
Takaya Terakawa 34:56
Wow, that's a very deep question. Well, I think I would keep the world quite similar, but make it a little easier on the regulation side. Maybe I would allow sending data from continent to continent — from Japan to the US, Japan to the EU, everywhere — but at the least, I want the business operators, the controllers, those handling personal information, to be more responsible for what they're doing. And I would promote awareness of privacy harm and of why privacy is important to society, and try to create more mutual understanding in the world.
Debbie Reynolds 36:11
Oh, wow, I love it. I think we need more mutual understanding. I think a lot of times we focus so much on how we're different and not enough on how we're the same. We're all dealing with the same problems — we may come up with different solutions, but I would love to see some kind of fundamental principles that we can agree on internationally or globally. Let's say: hacking is bad, or cybercrime is bad, you know?
Takaya Terakawa 36:46
Yeah. Okay. You know, after all, we're all humans. So we need to be human all the time, more or less.
Debbie Reynolds 36:54
That's right. That's right. Excellent. Well, thank you so much. This is great. Oh, my goodness, we have to find other ways to collaborate. Wonderful. Thank you.
Takaya Terakawa 37:05
Thank you very much. Yeah. I really enjoyed the conversation very much. Thank you.
Debbie Reynolds 37:10
You're welcome. I will talk to you soon.