"The Data Diva" Talks Privacy Podcast

The Data Diva E35 - Dr. Edward Tse and Debbie Reynolds

July 06, 2021 Debbie Reynolds Season 1 Episode 35
"The Data Diva" Talks Privacy Podcast
The Data Diva E35 - Dr. Edward Tse and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, "The Data Diva,” talks to  Dr. Edward Tse, CEO of Parenting in the World of AI.  We discuss online data privacy concerns around children, content online aimed at children, using technology an Ai responsibly,  parents and AI, more anonymity equals less transparency, media focus on anger and aggression, videos and YouTube content for children, the idea of modernity and bias in AI, advertising and targeting aimed at kids, advice for parental involvement with children using technology, social media use by kids,  and his wish for data privacy in the future.




 


SUMMARY KEYWORDS

people, parents, privacy, AI, technology, data, YouTube Kids, watch, Tse, types, sedation, posts, control, interests, called, anonymous, interesting, connect, aware, creativity

SPEAKERS

Debbie Reynolds, Dr. Edward Tse

 

Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds, and this is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with the information businesses need to know right now. So today, I have a very special guest from Canada, of all places, Dr. Edward Tse. He is the founder and CEO of AI Parenting. He says he is involved in parenting in a world of AI, which I totally love, the way he puts that. Also, he says he wants to transform screen time for educators, parents, and learners, from sedation to relation and ultimately creation, which is very hip-hoppy. So why don't you tell me? My introduction of you doesn't do you justice in terms of your skills, abilities, and all the things you're interested in. So tell us how you came to be Dr. Tse.

 

Dr. Edward Tse  01:22

Sure. Thanks, first of all, Debbie, for having me on your show, on the podcast. I'm excited about this program, and I think you're doing some very important work in this space. A little bit about myself: my background is in Computer Science, but even before that, it's always been creativity. It was around grade seven that my French language arts teacher caught me cheating. And instead of punishing me, he decided, hey, he's kind of creative, we'll give him some opportunities to be creative. And everything since then has been about creativity and promoting that. So we often say, don't sedate. There's so much sedation now due to AI technologies, such as YouTube and YouTube Kids, and we're trying to move from that screen time to quality time. And really, to do that, we need to move to connecting more, and we need to move to creating more. That's some of the main messaging behind this. But I also serve as a Privacy Officer for an educational company called NUITEQ. They make software called Snowflake. It's interesting; I kind of just moved into the privacy role because they needed somebody, but it's been such an interesting space, learning about the privacy side. We just recently got news back from a third-party rating organization called Education Framework that our privacy rating is now 4.5 out of 5, which is really high, even compared to others in our industry. A lot of that has to do with looking at the legal implications and looking at what data we actually need to track, what is central to the service. I've learned a lot in that process.

 

Debbie Reynolds  03:15

Yeah, that's fantastic. Well, tell me, I guess I want to figure out how you formed your interest in education as it relates to, you know, parents and children and AI. So how did that come to be?

 

Dr. Edward Tse  03:34

It's interesting. When I initially did my Ph.D. work, I did it on the use of voice commands and gestures. This was around 2006; that was my Ph.D. topic, and it was very interesting. I asked my supervisor at the time, hey, could I work on video games? And she said, no, you can't. So I came in on the weekends, and I built this thing, which was a voice and gesture control over Google Earth and Warcraft. I did it over the weekend and showed it, and she was like, wow, this is really good stuff. And then it ended up becoming the topic of my Ph.D. From there, we did demonstrations for the New York Police Department and others, and I always thought it would end up in a military kind of setting. But there was a company in Calgary that pursued me; it was Smart Technologies. They were trying to build something new, and I was involved with the creation of the first digital, let's call it, multi-touch table product for education. We launched this product with them, and I've been in the education space since. Before, I always thought, oh, we're going to build this amazing technology, and it's going to change education. But it took me ten-plus years to realize that technology doesn't change people; people change people. So my focus has been on people, and being on a show like this one, the podcast, has been very encouraging because it's a way of connecting and doing change from within.

 

Debbie Reynolds  05:11

Yeah. I love what you say, that people change people. I like technology, and my interest in privacy and my interest in technology have combined, so I really am excited to talk about anything that connects those two together. But I think where people go wrong, and this is, for me, when I look at movies, is all this dystopian future stuff, which is situations where humans allow technology to overtake them, or they defer, or they abdicate their responsibility to technology. So I love that you say humans change humans. Technology has changed things, though, and this is an interesting thing. Before GPS, I used to read maps. I like maps, right? But now people have GPS, and they don't know how to get around without GPS. And it's like, oh, my goodness, this is ridiculous. So in some ways, I feel like technology changes people in bad ways, where I think I should know how to get around without a GPS. What do you think?

 

Dr. Edward Tse  06:23

I think your GPS analogy is really relevant for today, because with so many of the technologies that we use, we have no idea how they actually work underneath. And the key for me has been equity, and even the giant digital divide that we have. Achieving any type of equity, or fighting back, or, you know, creating a movement: I often say creativity isn't just drawing pretty pictures. Ultimately, creativity needs an audience. And if you have an audience, eventually you can get a community that can follow you. And if you get communities to follow you, then you eventually have a movement. So really, it's about disrupting the status quo. And in order to do something like that, you need to know some basic things about the tools that are being used. So yes, in the GPS example, maybe not all the aspects; we don't need to read every map. But you've got to know how to navigate; you should know some things if you need to. Right? And I do think it's a good opportunity for us to think about what are the things that we still need to know, and what are the things that maybe we don't need to focus as much on, which is what we do with curriculum all the time in schools. I've had some discussions with Tara Linney, who did Code Equity, and one of her messages was that learning coding isn't just a thing that's maybe nice to learn. If you want to have equity, over half of Gen Z these days is working on freelance platforms such as Fiverr or Upwork. And what does it mean when your boss is no longer a human, when it's actually mediated by artificial intelligence? We need to think about that differently. Your knowledge of AI and your knowledge of some basics of how computers work is really going to determine whether you will thrive or you will be subjected to this system. So I do think there are good ways for people to fight back. But I do think preparing our kids for the future is going to look different than it did for our parents. And I have to say, we're a judgment-free community, because a lot of people don't know this stuff right off the bat, and that's okay. Right? Nobody could have told you, oh, our society would change this much. But I do think, beyond the technology, we've got to look at the society and how much the technology impacts society. We often just consider the technology, but we don't consider the societal implications. And it's the societal implications that are really interesting. In the 1960s, the Jetsons, you know, had all this futuristic technology. It was incredible. But they still believed that wives would stay at home, that they wouldn't be working. They didn't think about the societal changes that would result from a lot of this technology. And I think that's the part that's really interesting to me: technology is not just technology. Technology is part of an evolving society. In the same way, privacy is not just privacy over your data. This is about your rights. If people know more about you, they have more control over your actions. They have more control over your political decisions. And I think that if you want to gain back control, there is some need, like you said, to know the map, know how to read the map, know what's really going on underneath.
Otherwise, you're just going to be guided to wherever they decided to tell you to go.

 

Debbie Reynolds  10:12

You're brilliant. This is a fun conversation; I love the depth of the analysis, the analogy. Yeah, the fact that she was Jane, was that her name? She was the housewife? Yeah, exactly. I guess Jane in the future is like Siri, right? Siri is like this '50s housewife who does everything we ask her to.

 

Dr. Edward Tse  10:35

Actually, that's the thing that's kind of interesting: we used to think of some of the low-skilled jobs as maybe working in fast food, for example, maybe being a cook. But in the future, potentially everything that can be done on Fiverr or Upwork will become those low-skilled jobs. And that can include things like programming, and it can include things like writing or doing graphic design, and this is the thing that's really frightening. So that's why I feel it's more and more important to move towards the creativity side, because those are the things that are really hard for AI to do, and they're really unique to you. You will have your own style, and nobody else will be able to copy it. So in investing in creativity, you're investing in your future. So yeah, I like it.

 

Debbie Reynolds  11:31

Wow. Well, this topic is fascinating to me, parenting and AI, how parents need to think about algorithms and the things that their children are exposed to. Tell me, we just need to get into this a bit. What are the things that people aren't thinking about? Or what are the questions that they are not asking that you wish they'd be asking right now?

 

Dr. Edward Tse  12:04

So that's a great question. In terms of parents and what they're thinking, there is a lot of anxiety around the amount of time their kids are spending in front of screens. And I think they're not entirely clear how systems like YouTube recommend different videos for their children and how all of this works. One of the things they could ask that they aren't asking right now is about what's happening underneath, right? How do I get these recommendations? Why do we see certain things? I can't remember how many parents I speak to who say, I hate Caillou, or I hate this channel, or I hate certain types of channels. Why do those show up? Why does Peppa, you know, end up at the top? And I think one of the things people don't realize is there's kind of an unconscious system here. I describe AI as creepy, right? The stuff that it knows about people and the stuff that it can act on. And what happens every time people become aware of the AI is that there's a lot of public outrage, and then, as a result, the AI has to go more and more underground. It has to become more hidden. You've heard of the story of the Target dad, right? Target figured out that his daughter was pregnant, even though he didn't know it himself. It was more aware of what was happening in his family than he was. And in response to that, Target didn't stop. That's the key message here: they didn't stop. They stopped the creepiness, but they didn't stop the creeping. And this is very common; history repeats itself. That's exactly what's happening right now. We are moving from the era of full-on cookie tracking, with your email on it, to a slightly more anonymous version. They have not stopped. There is no stopping of the tracking that they are doing on you. What they are doing is stopping the creepiness, but not the creeping. In Target's case, the solution was pretty simple: you take that super-targeted ad and you throw in a couple of other random ads. So there's a super-targeted one, you put some other ones around it, and people are none the wiser about what is going on. In the same way, they throw a bunch of ads into your feed now, alongside the regular posts, and some of them are going to be way off, and that's on purpose. Right? Because if everything were super well targeted, you'd be like, oh, this is so good, it knows so much about me. And in the same way, as you're probably already aware, there are all these different phases that we're moving through in the privacy space, from ID cookie tracking, to a session ID or Mixpanel-type style, to a FLoC-type style, a federated learning style. And in each one of these phases, yes, it becomes technically more anonymous. But what it also means is that you have fewer rights to this data, because it's not technically you, and you have less ability for any recourse if a decision is made by AI. They'll say it's not about you; it's an objective decision. But these systems have huge bias built into them. I think many people have already talked about the types of bias in AI, and as for your ability to fight back against that, to say the AI's decision is not correct, there's less and less recourse, especially when you don't control or have any access to that data.
And so, to me, that's the growing trend that I see in the privacy space: yes, it's going to be more anonymous, but that also comes with some consequences in terms of your ability to correct that information and your rights to that information.
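
To make the "stop the creepiness, not the creeping" idea concrete, here is a minimal, purely hypothetical sketch (the function, data, and names are illustrative assumptions, not any platform's actual code) of how a feed could dilute a few precisely targeted ads with random filler so the targeting is less obvious to the user, while the profiling upstream continues unchanged:

```python
import random

def assemble_ad_slots(targeted_ads, inventory, num_slots=5, targeted_ratio=0.4):
    """Hypothetical sketch: mix a few highly targeted ads with random filler.

    The profile-driven targeting still happens upstream; the filler only
    hides how precise it is ("stops the creepiness, not the creeping").
    """
    num_targeted = max(1, int(num_slots * targeted_ratio))
    slots = targeted_ads[:num_targeted]                        # keep the precise picks
    filler = random.sample(inventory, num_slots - len(slots))  # pad with decoys
    slots += filler
    random.shuffle(slots)                                      # so the targeted ones don't stand out
    return slots

# Example: two very specific ads buried among generic ones.
print(assemble_ad_slots(
    targeted_ads=["crib mattress", "prenatal vitamins"],
    inventory=["lawn mower", "headphones", "wine glasses", "office chair", "socks"],
))
```

The design point of the sketch is that nothing about the user's profile is discarded; only the presentation changes.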

 

Debbie Reynolds  16:12

Right. So the more anonymous it becomes, the less transparent it is as well, right? Because once they anonymize it, they don't really have to tell you anything about that data. And when people are thinking about FLoC from Google and things like that, to me, a credit score is a FLoC, as far as I'm concerned. Because basically, say you want to buy a car or a house, you give them your information, and they're doing risk modeling on you based on other information that they have from other people or other systems. You know what I'm saying? So they are judging you, not just on yourself; they're also rating you against these other anonymized people. To me, that's what FLoC is. So it's not a new thing, in my opinion.
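
As a rough illustration of the cohort idea Debbie is describing (a hypothetical sketch with made-up interest vectors and cohort names, not Google's actual FLoC algorithm), people with similar behavior get the same group label, and anything downstream, an ad auction or a risk score, only sees the group, so you are effectively judged by the behavior of everyone else assigned to it:

```python
import math

# Hypothetical interest vectors: (news, gaming, parenting, finance) attention shares.
USERS = {
    "u1": (0.1, 0.6, 0.1, 0.2),
    "u2": (0.2, 0.5, 0.1, 0.2),
    "u3": (0.7, 0.0, 0.1, 0.2),
}

COHORT_CENTROIDS = {
    "cohort_gamers": (0.15, 0.55, 0.10, 0.20),
    "cohort_news":   (0.70, 0.05, 0.05, 0.20),
}

def assign_cohort(vector):
    """Assign a user to the nearest cohort centroid (simplified illustration)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(COHORT_CENTROIDS, key=lambda name: dist(COHORT_CENTROIDS[name], vector))

# Downstream systems only ever see the cohort label, yet decisions about you
# are now driven by the aggregate behavior of that cohort, much like a credit score.
for user, vec in USERS.items():
    print(user, "->", assign_cohort(vec))
```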

 

Dr. Edward Tse  17:09

Yeah, I think the other aspect of it being buried underground, and FLoC is certainly one of them, is that we're moving from conscious input, which used to be the likes, you know, comments, shares, to very much unconscious input. Both Facebook and YouTube focus a lot on watch time. We don't even have to like the content; we just scroll. I think Facebook's VP in India said that we scroll the equivalent of a Statue of Liberty worth of posts every single day, and your milliseconds of watch time on each one of those posts tell them an enormous amount about what is interesting to you. When we live in a world where all of that information, those milliseconds of watch time, is tracked, they can build a very detailed profile without you having clicked on anything in particular. And I think people don't realize that it's one of the reasons why we are getting much more aggressive posts. Before, I was talking about Peppa; Peppa is very cruel to her brother. What we've learned is that these types of aggressive things attract the really young mind more, so they'll actually sit there and watch longer. In the same way, Caillou is very mean to his siblings. It turns out this cruelty towards others, when you think about unconscious input, those milliseconds of watch time, it's not the conscious mind, it's not even our forebrains, that controls how much time we spend on it. It's very subconscious. So now we're back to the fundamentals of sex and aggression, right? People are spending more time on things that are more aggressive. We know this, and so does most media, and that's why there's so much focus on things that are more aggressive. Go to Twitter; there's a lot of anger, because anger means longer watch time, and anger means more ads shown. If that's what they optimize for, they say they optimize for what people want, but what they're really saying is, we optimize for what people's subconscious wants, not necessarily what people would say that they want. And those are two different things.
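
As a rough illustration of this "unconscious input" idea (a hypothetical sketch with made-up event names, not Facebook's or YouTube's actual pipeline), dwell time alone, with no likes, comments, or clicks, is already enough to rank what holds a viewer's attention; content that keeps eyes on the screen, including aggressive content, naturally rises in such a ranking:

```python
from collections import defaultdict

def interest_profile(scroll_events):
    """Illustrative sketch (hypothetical schema): infer interests from dwell time alone.

    scroll_events is a list of (topic, dwell_ms) pairs captured as posts pass
    through the viewport -- no likes, comments, or clicks are needed.
    """
    totals = defaultdict(float)
    for topic, dwell_ms in scroll_events:
        totals[topic] += dwell_ms
    total_time = sum(totals.values()) or 1.0
    # Normalize so the profile reads as "share of attention" per topic.
    return {topic: ms / total_time
            for topic, ms in sorted(totals.items(), key=lambda kv: kv[1], reverse=True)}

# A few milliseconds here and there already rank what held this viewer's attention.
events = [("peppa_clip", 4200), ("news", 800), ("peppa_clip", 5100),
          ("cooking", 1200), ("argument_thread", 6900)]
print(interest_profile(events))
```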

 

Debbie Reynolds  19:50

You have pointed to Caillou and Peppa. I want to make sure people know who they are.

 

Dr. Edward Tse  19:55

Yeah. They're cartoons some kids watch. Peppa is an animated pig who has a sibling. And Caillou is a cartoon as well. They're just two cartoons you can watch on YouTube Kids.

 

Debbie Reynolds  20:15

Cool. Okay, that makes sense. I guess, in retrospect, I was very fortunate that my parents were very strict with us growing up about the things that we watched and our influences. I grew up in a house where we had one TV, so everyone had to watch that one TV in a common area; we all had to share it. And then our parents limited our TV watching severely, so we didn't watch a lot of TV. I think that's one reason why I don't watch a lot of TV now, because I had that experience growing up. But I can think back to growing up: if we'd had YouTube and cell phones, I would imagine that would be maddening, because there are so many other ways that the outside can get in to impact your kid, right? For us, it was the Sears catalog that came and had all the toys in it, and we'd try to tell our parents what we wanted. Or maybe there'd be a commercial in a cartoon or something like that. But I felt like that was at least manageable, you know, for parents. I can't imagine what parents go through right now with TV, with tablets; everything you could think of is trying to sell someone something or trying to gather some information from you. What are your thoughts?

 

Dr. Edward Tse  21:46

So, I love what you said. And I think it's not just yourself; parents these days are in exactly the same boat. When they grew up, they didn't have that much technology, so technology was in a public space, right? What you did was very closely monitored, and I think that was a really good benefit. Many parents these days grew up without that technology, and so now what they see happening online seems so wrong. But if you think about it from our kids' generation, they've known nothing but this, so it doesn't seem wrong to them. And I think that's one of the things causing a big divide between parents and their kids; the kids will say, oh, my parents just totally don't understand what we're going through. And I think having some understanding of what is different helps. What's the difference between YouTube just showing you a bunch of videos versus TV? Well, when somebody was selecting the programming for TV, that was a person, and there were very specific rules, laws, on what that person could show you versus not show you. A lot of those rules do not apply in the AI world, where the system just recommends something to that individual. And so you will see stuff. It's one of the reasons we had Elsagate, one of the reasons certain very aggressive types of Elsa videos exist. Why would that bubble up to the top? Because, again, it's subconscious watch time, and it's not regulated, because these organizations have certain safe harbor protections that reduce their liability.

 

Debbie Reynolds  23:35

Right, right. So I'll give an example of a problem that technology can't really solve, that I think you need a human to solve, and I think they have actually tried to help with this. Many years ago, there was this whole big fad of creating coloring books for adults, right? In the past, people naturally associated a coloring book with a child, something that a child would use. So then, on certain channels, or certain places where these things were publicized, the systems automatically associated a coloring book with a child. So they were actually advertising adult content to children, because they were coloring books. What are your thoughts about something like that?

 

Dr. Edward Tse  24:35

I think, in terms of guiding parents, this is one of the key things that we say: you're moving from sedation, you know, "just watch the shows so that I can do other things," to relation, where what we're doing is showing an interest in their interests. The analogy I use is playgrounds. It's not that our parents loved the playground or were so excited to go and visit the next playground when we were young. But they would still go, and they would still be involved, because that's where the friends were, and that's where other people you could hang out with were. They participated in that because they cared about you. In the same way, if the playground is moving from the physical playground to the digital playground, it's about showing an interest in your own kids and saying, yes, those kinds of things are interesting to me. The whole point of this is to build the relationship to the point where you can at least have some prediction of how they would react if they saw something that made them angry online, or what would happen if they saw something mean said about them online. Being able to understand that type of reaction, especially online, is a very important skill. So it seems kind of childish, this coloring book that you were speaking about. But if they're into coloring books, how do you get into it with them, right? If that's really what's driving them, what could you do to show an interest and maybe guide them to the next step? That's what they're often looking for. They're at a certain level on Roblox; is there a certain level in terms of their coloring? Do they want to improve? How would they learn to improve? You could maybe guide them in that space. And that's a great way to connect, and it doesn't always have to be on a computer; maybe there are other ways you can do it as well.

 

Debbie Reynolds  26:28

I don't envy parents and what they have to do. I'm not a parent, but I am an aunt. I have one niece; she's more like a tin foil hat person like me, so she won't even comment on anything on social media. She may watch things, but she won't comment or say anything, which is fine. But can you give us some pretty practical tips about what parents can do? When a kid is out of your area, so, let's say they're at school or they're somewhere else, you can't really control what happens there in the educational sense. Obviously, you want to be involved with what they're doing. But what about when they're not in school? What about when they're at home? What tips do you think would help parents manage this better?

 

Dr. Edward Tse  27:30

It's interesting. We started off talking a little bit about the digital privacy space, but there's also the home privacy space as well. One of the things I found that kids search for often on Google is, why did my parents take away the door, say, to my bedroom? That's a general concern for a lot of parents. And it's interesting, because, as you described, when we were growing up, we didn't have that privacy on digital devices; we didn't have devices we could bring to our bedrooms and use in a very private manner. Now, there is a much stronger expectation of privacy from our kids. As a parent, I do want my kids to have that level of independence. I want them to be thinking, yeah, eventually I'm going to have to manage all of this on my own. But the key there is, can we talk about it? Can we still be a part of that conversation? I think kids feel really disconnected right now because of that divide. We grew up in a generation that didn't have these technologies, and now everything seems wrong. But it also could mean that we are very critical of their interests, even though there's nothing necessarily wrong with them. If they really like Peppa, that's not necessarily a bad thing, right? There are lots of good things covered in it that we can talk about, so we have to look on the bright side. I'm sure our parents were the same way, right? When we started discovering video games or movies or whatever, it was, oh, look what they have access to that we didn't have access to. And I think that's normal. I think the key is that in the home, the technology, all of these things, even the kids being at home, is not a permanent thing, but the relationship is the thing that lasts a long time. So I spend a lot of time focused on that, because that's the part that's really tough to repair or to change afterward. It's important to set some rules and guidelines that work, so that they understand why, not just "do it because I say so"; there's a reason behind what you're suggesting. And it points back to, well, I'm doing this because I believe it's going to best prepare you for the future. And yeah, sometimes I need you to take a break. Sometimes they get really frustrated, and it's like, I'm here to help, right? I'm here to make it interesting for you. So a lot of my time is just taking some stuff from school, for example, and figuring out how I can connect it to their interests. I know they're really into Star Wars, or they're really into watching Ninjago right now. Okay, great, how can I connect what they're doing in school to that? It was that kind of realization, like I mentioned with that grade seven French class, where you start to realize that there are no boring tasks, just boring imaginations. If you can connect to what they are interested in, they're going to be so much more likely to want to do it intrinsically. We can't really force anybody to do anything these days, and it's very different. We're not in an era where you can just say, okay, I'm going to control what you do online; it's very hard to do that. So they will need to have more independence. So how do we make sure that they make the right decisions when they get to that point?
Or, maybe if they're already beyond that, how do we connect with them enough so that we can be aware of what's going on?

 

Debbie Reynolds  31:09

Yeah. The future is about our gaze, right? What our eyeballs touch. As you said, marketers are looking at how long you've looked at things, and that's telling them things about you. Right? I hope you saw the documentary The Social Dilemma. One thing in that documentary that was concerning for me, and obviously, as a tech person, I knew a lot about what they were talking about, and I liked the way they broke it down to simple levels, but the thing that really stuck out for me, and I think it's something that we've talked about here, is the psychological manipulation that sometimes happens. There's an example in the documentary where this guy broke up with this girl, and he was not as interested in interacting with or looking at stuff on social media. So they started throwing up updates about the girl who had broken up with him, which were very painful reminders. But this was a way for the marketers, or whoever, to get him to be more engaged with the content and look at it more. So to me, the issue there is being able to capture someone's attention with something they were very hurt by. It may help the marketer, right? But it doesn't help the person, the child, right? So what are your thoughts about that?

 

Dr. Edward Tse  32:52

You're absolutely right. People experience trauma in their lifetime. And trauma, I heard from George Valenzuela, is very much like a physical injury: you go to the hospital, you get somebody to help you with it. It seems like, for the emotional trauma we experience, we kind of think that's normal and we shouldn't have to go and speak to anybody. But you are actually hurt. You broke up with somebody; you actually do need to talk to someone. And I think the issue is that the technology can sedate it a little bit; it can point you to other things that kind of feel as if they satisfy, but they don't, and none of that actually solves the core issue. So it's led to this secondary pandemic around people's mental health. And it's made things far worse for a lot of people, because instead of seeking help, you can just spend more time browsing posts or getting involved with social media. There are no rules or legislation besides, hey, you should go and call these numbers. But we definitely see a major uptick. The way I've heard it described before is that the mind is this very scary neighborhood, and especially as we are isolating at home, you don't want to go there alone. More and more people are alone, and they're not speaking to people. But you can still reach out, and that's what we don't do. It's all just business meetings, or it's all just, oh, it's a work thing. We don't just casually say, hey, we're going to have a coffee, a virtual thing, and we're just going to hang out. We should, because our mental health depends on it. We need that. We are not meant to struggle, to live, to work, to learn alone. We were meant to do this as a community, as families, or as groups of friends that are all together. And I think the more we say, oh, because of this, I have to isolate, yes, you physically are isolating, but you don't need to psychologically isolate. This world that we're in is very much one around the mind, and around how we know that just subtle things we're shown can have a huge impact on your mood, on pretty much what you buy, you name it. Political decisions, we know that there are huge changes that happen as a result of what people see. And there's no content moderation or those types of things; it results in whatever generates more watch time. And this watch-time priority among big tech companies is feeding a lot of the mental health situation. It may be good for selling people stuff, because high-anxiety individuals, or people who are really stressed out, tend to buy things to reduce that stress. But it's a patch, not really a cure, and it doesn't address the root cause.

 

Debbie Reynolds  36:29

So if it were your world, if it were the world according to Dr. Ed Tse, what would be your wish for privacy, whether it be something in technology, AI, regulation, anywhere in the world?

 

Dr. Edward Tse  36:51

So can I ask you a question about that one, just to clarify? Are you saying if we could change anything in terms of policy, or anything in terms of technology, or anything in terms of, like, education, anything with parents, those kinds of things?

 

Debbie Reynolds  37:07

Whatever your wish would be, you know, if you waved your magic wand, anything that you would want to change that relates to privacy, in any realm, what would it be?

 

Dr. Edward Tse  37:27

We are starting to see a little bit more rights around people's own data, where some states are using land terminology to describe people's data, as you're aware. And I think that, increasingly, we're starting to realize that this data is more than just, oh, yeah, you liked a few things. This data is very much a profile of your unconscious mind, potentially a better profile than you are aware of, or that your friends are aware of, or even your spouse or your closest family members. And we don't have a lot of control as a society when it comes to this data. I would hope that, in the future, this data is something that people are aware is being tracked, and that they have way more control and way more rights to it. Then you start to understand how things change: being able to quickly move from one platform to another, or to say, hey, you're not allowed to have this data anymore. Because as a consumer of this material, you have no say right now. Meredith Whitaker said that concentrated power is being masked as technology innovation. What this means is that big tech has increasingly more power, potentially more power than some countries, over what we do, what we like, and our careers. It used to be the case, when we were growing up, that playing with computers was all optional; it was for fun, and we enjoyed doing it. But nowadays, it's not. If I have no profile online whatsoever, then if somebody is hiring, how would they know what I can do? And I think we need to start saying that this is important: we will not have control of ourselves and our own society without control over this data. Right? If something happened a long time ago, it wasn't recorded, so we have the advantage that the stuff we learned the hard way doesn't follow us around. But our children won't have that unless we fight for those types of rights. So ideally, we live in a society where we have the right to say no to a lot of companies, to say, yes, I said yes before, but now I've changed my mind. And that gives control back to us as families, as citizens.

 

Debbie Reynolds  40:29

Right. Wow, that's fascinating. Fascinating. Well, thank you so much. This is a really important episode. We've never talked about education and parenting before, and I know a lot of people have these challenges, so I'm glad that we were able to talk about them.

 

Dr. Edward Tse  40:49

Absolutely, I'm glad that we can share this. Do you mind if I mention, if people want to learn a little bit more, we have a website called AIparenting.live. We do a weekly live stream, and I'm hoping to have you on there, because you've got some really good insights as well. Really, our goal is just to help parents move from screen time to quality time. There's an insider's mailing list; you can go and check it out if you think that's valuable for your family. I have other resources as well, and I'm planning on doing maybe a summer camp for kids, an AI summer camp, where I want to talk about not just AI but also society. It'll be for younger kids, you know, kids who watch YouTube Kids. What is AI? What are some of the basics? Why do you see what you see on YouTube Kids? Why does it make you angry? Those types of things. So it's understanding a little bit about anger and what we do in that situation; it's as much emotional control as it is AI.

 

Debbie Reynolds  41:54

Absolutely, because they're connected. Absolutely. Well, thank you so much. This is great, and I'm looking forward to being on your show.

 

Dr. Edward Tse

Thank you so much, Debbie. Appreciate it.

 

Debbie Reynolds

All right.