"The Data Diva" Talks Privacy Podcast

The Data Diva E40 - John Arts and Debbie Reynolds

August 10, 2021 Season 1 Episode 40

Debbie Reynolds “The Data Diva” talks to John Arts, Co-Founder of Rita Personal Data, an app designed to give individuals control, visualization, and transparency of their data. We discuss data agency for individuals, the influence of the GDPR on the data landscape, cookie control, smart devices gathering data, how respect for security and privacy can be a business advantage, the asymmetry of value with free services, limiting third-party risk, how consumer trust leads to more accurate data, why new situations need new models, and his hope for the future.


42:18
SUMMARY KEYWORDS
data, people, user, companies, privacy, transparency, consumer, consent, brands, service, individuals, sharing, ads, problem, future, customer, years, thoughts, cookie, understandable
SPEAKERS
Debbie Reynolds, John Arts

Debbie Reynolds  00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know right now. Today, I have a special guest on the show from Brussels, Belgium: John Arts, who is the co-founder of Rita Personal Data. John and I had a great conversation, a great meeting, a few days ago, and I said, oh, you should totally be on this podcast, because I really love what you guys are doing with your product. And I really like the way that you're thinking about the data of individuals in terms of how consumers can control their information. So why don't you tell us a little bit about you and a bit about the company?

John Arts  01:07
Yes, thanks a lot, Debbie. It's a pleasure to be here. Rita's mission is quite straightforward: we want to put users back in control and give them ownership of their data. The way we do that is that we allow users to first of all aggregate all of their information from multiple platforms; we then visualize it for them; and then we give them the tools to say yes and no to the companies that they want to have access to their data. For us, the starting story was really my co-founder and me not having a clear understanding of the data landscape. We were very curious: how are people using our data? What data is being collected? And who are they sharing it with? That first step of creating curiosity was really core to our product development, and that's why we really wanted to give users transparency and really shed light on this whole industry. That's what we do through visualizing data. And then the call to action in the mobile app is really to give users the choice: that company, yes, I have a relationship with them, yes, I trust them, they can use my data. Others I don't know, I don't interact with: delete my data. That's really our core premise.

Debbie Reynolds  02:15
Yeah, I think, especially now that the GDPR is out and we're seeing companies, countries, and different municipalities take bits and pieces from the GDPR, it's really interesting, because I remember when the GDPR first came out, a lot of business people were just pulling their hair out about the whole consent issue. As you know, in the GDPR there are six legal bases for moving data or using someone's data. And companies fought tooth and nail, and are still fighting tooth and nail, to use those other five bases; they don't really want to go to consent. So I think it's interesting that you all are going for consent, because consent is fragile: people can revoke consent, and they can consent to only certain things. So I think creating a situation where consent is not considered a dirty word is a great thing. What are your thoughts?

John Arts  03:27
Yeah, consent starts with transparency and understanding what you're opting into. And that's, again, why our users are just so excited by knowing, okay, this is what they're talking about when we're speaking about data. Because putting a big consent button on a webpage doesn't mean much; it's not very tangible, and people are clicking it away. We're really turning that whole model around: first of all, giving the data to the users, giving them full access to it, and then allowing them to also proactively make those decisions after the fact. That's a bit the model that we rely on. It's interesting you mention the GDPR, because it was a big inspiration to my co-founder and me when we started. We were just so inspired by this transformative piece of legislation, and more precisely the right to access, and that's also really where the name Rita comes from. The right to access stipulates that every EU citizen has the right to request access to their data from any company and say, hey, give me what you have on me. And that's the first thing my co-founder and I did. We went to Google, we went to Facebook, and said, hey, we now have this right, give me all the data that you've got on me. For us that was, again, purely out of curiosity. But what that enables is really giving users ownership and putting the data back in their hands. That first step now is all around awareness and education, as you said, making consent a lot more understandable. But what we're going towards as a company in the future is really putting the user central in the data economy that is being built. And that's really what our users are after and what we believe the landscape should look like.
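The Article 15 access request John describes is, in practice, just a written request to the data controller. As a rough sketch (the template wording, recipient names, and email address below are invented for illustration and are not Rita's actual flow), generating such letters might look like this:

```python
# Sketch: generate a GDPR Article 15 (right of access) request letter.
# Template wording and example recipients are illustrative assumptions.

TEMPLATE = """\
To: {company}
Subject: Data subject access request (GDPR Article 15)

Dear privacy team,

Under Article 15 GDPR, I request access to all personal data you hold
about me, the purposes of processing, the recipients it has been
shared with, and the envisaged storage period.

The email address associated with my account is {email}.

Kind regards,
{name}
"""

def access_request(company: str, name: str, email: str) -> str:
    """Fill the template for one data controller."""
    return TEMPLATE.format(company=company, name=name, email=email)

if __name__ == "__main__":
    # The services mentioned in the conversation, as example controllers.
    for company in ["Google", "Facebook"]:
        print(access_request(company, "Jane Doe", "jane@example.com"))
```

One letter per controller is the manual version of what an app like Rita automates at scale.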

Debbie Reynolds  05:16
Yeah, you know, there's a lot of controversy around cookies and how people do cookie banners. When I'm in Europe, I get fewer cookie requests, right? Because you all have a law saying people can only ask so many times, and we don't have that in the US. So we get bombarded with stuff, and we don't even have cookie laws. When the GDPR came out, a lot of companies didn't want to separate, or couldn't separate, EU citizens or EU persons from people in the US, so they have done all these kinds of blanket cookie things, and they are just out of hand, in my opinion. Every website you go to gives you an accept-all option, and then they try to slow you down when you want to set a preference: you go into this other window, and then you have to make all these choices. The thing I like about what you guys are doing is that you're allowing people to sit down and proactively decide what they want, to choose who they want to share their data with, as opposed to: you go on the internet, you want to do something, and then there's this roadblock where you have to answer all these questions. And as you know, a lot of people, just to get through it, will just say yes, because that's the easiest button there. Because it's not a simple yes or no; it's a yes here, and maybe a no to these other things, and you're interacting with dozens of websites. So this really wears on people's mental health, having to make all these choices. What are your thoughts?

John Arts  07:08
Right. It's an interesting point. The way we talk about it internally is that there's kind of a trade-off between being privacy-safe and being convenient. You have to take more effort to operate in a privacy-safe way, whether it's using the tools out there, or going through the privacy policies and understanding the terms and conditions under which you're accepting cookies. And that's really why we promised ourselves, in a long debate with our designers, to make privacy sexy and accessible, to make it really smooth and understandable, so that you can have both: interact with Internet services in a private way, but also conveniently. And it is a challenge, you know, even for companies. I think in the EU what is really interesting to see is that some companies are taking the challenge head-on and are really saying, hey, we can use this as a competitive advantage: create a very clear cookie banner, or even use some less invasive first-party tracking instead of third-party cookie tracking. And those companies are benefiting from that. On the other hand, there's still a large majority of companies that are seeing this change in the landscape from a regulatory perspective, from a technology perspective with the banning of third-party cookies, and from a user perspective, and they're not really seeing where the next 10 years are going. They're still attached to this idea of, hey, we can still get data from first-party cookies, let's try to get as much as we can. And some of them, like some data brokers, are just making the terms and conditions increasingly more difficult, making the letters even smaller and less understandable. So one great example of someone who's doing it well, for us, is the BBC in the UK. They have inspired even us in the way we should be writing our privacy policy.
It's so great to see that these leaders, companies with quite a social mission and responsibility, are doing it well. They are providing that transparency. I think we'll see a bit of a gap for the first couple of years, and the companies that take privacy seriously as part of their mission and really care for the customer's privacy will be the ones that come out best.
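The first-party versus third-party distinction John draws comes down to which domain sets the cookie and whether browsers will send it cross-site. A minimal sketch of the response headers involved (the domains and cookie names are made up; the attribute choices reflect common practice, not any specific site):

```python
# Sketch: the Set-Cookie headers behind first- vs third-party tracking.
# Domains and cookie names here are invented examples.

def set_cookie(name: str, value: str, domain: str, same_site: str) -> str:
    """Build a Set-Cookie response header line."""
    return (f"Set-Cookie: {name}={value}; Domain={domain}; "
            f"Secure; HttpOnly; SameSite={same_site}")

# First-party cookie: set by the site you actually visit; with
# SameSite=Lax, browsers will not attach it to cross-site requests.
first_party = set_cookie("session_id", "abc123", "news.example", "Lax")

# Third-party cookie: set by an ad domain embedded in many sites.
# SameSite=None is what allows cross-site sending, and it is exactly
# the behavior browsers are moving to restrict or block.
third_party = set_cookie("tracker_id", "xyz789", "ads.example", "None")
```

The "less invasive first-party tracking" John mentions stays in the first category: the data never leaves the relationship between the visitor and the one site that set the cookie.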

Debbie Reynolds  09:33
Yeah, I agree with that. And I agree that the BBC has a good policy. The Guardian has a great one. Hearst is really good in terms of how they articulate their privacy policies and cookie stuff as well. I'll give an example of something that we're dealing with in the US, which is kind of crazy. There's a heatwave going on right now in Texas, and the Texas power companies are adjusting the temperature of people's thermostats, for people who have smart thermostats and signed up for this kind of energy program. So if your house is hot, let's say when you go to sleep at night, they'll turn the thermostat up, so it's hotter in your house and you're using less energy. And a lot of people are really outraged about this, even though, if you read the fine print, a lot of them have agreed to this as part of this smart thermostat program. And a lot of people are opting out now. So to me, part of the problem is that people say, oh, well, those people should have read those 80-page policies. I would love to get to a place where companies are not writing 80-page policies that people have to read, because they know that people are not going to read those things. If you had told them plainly, you know, when there's a heatwave, we're going to control your smart thermostat and make it hotter in your house, and you're not going to have as much control over that, I think people would understand. But what are your thoughts about companies trying to be, or endeavoring to be, more transparent and more clear about what they're offering, and what the trade-offs are for individuals?

John Arts  11:36
Yes, absolutely, clarity is everything. And as I said, it's a real challenge. When we set up our privacy policy, we did that consulting industry experts and doing iterations with our community, so that it becomes really crystal clear. And again there are two sides; the trade-off that we ended up with is transparency on the one hand and understandability on the other, because the more transparent, the more granular you go into explaining what you're doing, the less understandable it is for the user. So it is really a challenge. We've written a blog about how we did it: looking at great use cases that are out there, for example the Belgian government, also a fantastic use case, explaining what data is actually going to be collected by the COVID-19 tracking app; speaking to industry experts, because there's so much research around this that we can learn a lot from; and then asking the community, asking the reader: Is this clear? What is missing? Is it missing transparency? Is it missing understandability? And then we create a very personalized approach, so the high-level user can go through it really quickly, and another one who wants to go into detail can go into detail. Now, on the case that you brought up in Texas, what I'm curious about is: to what degree is that a manual adjustment of the temperatures in the houses of the customers? Or is it all algorithmic?

Debbie Reynolds  13:13
Algorithmic, because they have smart meters that are connected, so they can actually manipulate them on a mass scale. I don't have a smart meter, because I like a dumb meter, one that I can control. But I think people who are getting those meters probably didn't fully understand that someone else can control the temperature in their house. They don't know what's happening in your house, right? They don't know what your temperature situation is, or you may have someone who's sick and needs cooling. To me, that's an example of technology gone awry, because you're making these blunt, blanket decisions about people without knowing exactly what their needs are.

John Arts  14:11
Yeah, it is an interesting example. And I think one other case in which that happens is Waze. Waze was an interesting case because, before Waze, the GPS (in Belgium we call it the GPS, the Global Positioning System) that navigates drivers around streets would always find either the shortest route or the fastest route. What Waze did for the first time was split up certain people to manage the traffic. For example, if we both take the same route, they would send you one way and me another way, so that they mitigate traffic and don't send all of their users to the same streets and create too much traffic. So what that did for the first time was really make decisions for the user: they choose your way, and the algorithm is deciding whether you go that way or this way. One could say that has immense implications for what happens in your life; maybe something important in your life would have happened on a certain route, and it didn't. So it is interesting to see increasingly algorithmic decision-making. What becomes important there is the transparency, and then the user opting into that with full awareness of it. We could reflect it back to the personalization story: we've got some users who really like personalization but still want to control their data. There are other people saying, I feel like I'm put inside a box and it limits my choice; I want some serendipity in the ads and the content I see. So the general takeaway, at least from what we've found with our user base, is that the more understanding you have of how the algorithm works, why a certain thing is recommended to you, or why an algorithm is making that decision, the more people are okay with it. Because, for example, if there's a really invasive ad, the question is, how did they get that data? It's all a mystery to me, and that's what makes it invasive.
The more you educate that individual, and people are becoming more educated... You say, don't you like your really personalized Netflix suggestions? Yes, yes, we do, and okay, they'll need that data for it. And speaking to marketing professionals, or people who are more aware of the data and algorithms behind it, they enjoy personalization, but they also want privacy. So there's, again, a hard trade-off to manage both. And the answer, to me, is always transparency and education.

Debbie Reynolds  16:51
Yeah, not to pick on any one movie service, but I have two different profiles on a service, and one has a woman's name and one has a man's. And without me doing anything, they're suggesting different movies to me, which is troubling. Just because I'm a woman doesn't mean I want to watch all the bodice-ripper, romantic comedy things, and just because I'm a man doesn't mean I want to look at the shoot-'em-up, bang-bang stuff, either. So I'm concerned about people being in silos. You're in a box; they're putting you down a trail; you can't really see anything else, but you don't know that there is anything else to see. So I feel like, for so many people on their journey, when they're looking at websites and such, it's just not apparent to them all that there is out there. They're being presented with the world as if this is everything there is to see, and it really isn't that way.

John Arts  18:08
Yeah, it is a bit freaky. And I think what a lot of services will do in the future is give two options, saying, hey, here's a personalized feed, and here's a completely randomized one, so you can explore both. The optimal middle way for us is, and we offer this feature in our app, that you can see a list of all the interests that Google or Facebook used to target you. So it would say, on my side, data marketing, maybe something on blockchain, and I can restrict those interests. That means if a user is constantly seeing these crypto or blockchain ads and is tired of seeing them, they can go into Rita and say, hey, I don't want to see that anymore. And it's really a win-win for both the user and the marketer, the service provider offering a product or a service. It's that choice, that active choosing: hey, I want to see this kind of content. Same with your Netflix: one day you may really be into history movies, and you can actively choose that. You know, letting the user be part of the algorithm and saying, yes, I want serendipity, or no, I want hyper-personalization. So I think that's where the future is going.

Debbie Reynolds  19:25
Yeah. And I think, to me, this should be the decade of transparency. So if companies don't understand anything else, they should understand that people want more transparency into their data and how their data is used. Companies that are not, or don't want to be, transparent are going to fall behind, because that's really what people want. What are your thoughts?

John Arts  19:54
Yeah, absolutely. I think we touched upon it briefly a few minutes ago, but companies are also coming to us and asking, hey, you guys did a great job visualizing and making that whole data journey of a user very understandable; how can we do the same? And those are going to be the winners. The way we see it internally, this shift towards user-centric data and Privacy Awareness is really as big as the shift to renewables and sustainable products and energy in the last 20 years. I mean, in 2010 it was PetroChina and Exxon Mobil at the top of market cap globally, and in just 10 years we've seen that completely crumble, because of sustainability trends and understanding of the global problem we're facing in that space, and, secondly, all the new technologies out there that can replace their market dominance. We see the same happening to big tech. We're not going to say they're going to fall, but it's crumbling, really, out of three pillars of change: first of all, user awareness, which is incredibly powerful; technological changes; and then the regulatory tailwind, which is really pushing on rights like the right to access and the right to portability, allowing users to export their data from one service and import it into another. This is going to break down all of the existing silos in the data landscape. We see that as the largest change in the next 10 years: really, that shift from data being opaque, not understandable to the user, and not controlled by the user, towards a whole movement of transparency and user control, the same way the whole sustainability wave increased transparency and more ethical practices among companies.

Debbie Reynolds  21:59
Yeah, I think businesses that are smart will embrace this and chase this, right? You want informed consumers, you want people that trust you, you want people to share information with you and get value from your service. Up to this point, a lot of companies have seen privacy as almost a tax on their business; they're like, oh, that's just another regulatory thing that I need to do. And it's not really about your customer. Your customer can vote with their feet; they can move to other services, they can do other things. So if you want better relationships and better customer satisfaction, you have to consider the fact that individuals don't mind sharing information with companies that they trust and where they feel like they're getting value. So I want to ask you a little bit about that asymmetry of value, especially as it relates to apps or services that are free. Just to pick on Facebook for today: let's say you have a Facebook account, and let's say, hypothetically, they earn $5,000 a year from you because of the stuff that they are able to glean from you. Then I would ask the question, am I getting $5,000 worth of value from Facebook? And the answer may be no, right? But there is this asymmetry that I feel exists in certain applications when they're free, because there's almost a limit to what people can consent to. So I think consent is a huge central issue that's coming up, almost like the smart thermostat example. I'm sure that people right now, while they're going through the heatwave, are thinking, this is not an even exchange.
You know, I don't feel like the service is giving me as much benefit as they get from me. What are your thoughts?

John Arts  24:45
Yeah, there are asymmetries; there are huge asymmetries. I like the one that you brought up, value. The first one we touched upon already: information asymmetry, companies knowing so much about you, the end consumer, and you not knowing what they know. There are huge downsides, at a very societal level, to having such asymmetry between organizations and individuals. But then the value asymmetry is also an important one, because as we spend our days increasingly online, we generate value. And that value is really generated in two ways. The first is the attention that we give to online platforms, and the second is data. The first one, the attention, is because Google and Facebook can present you with an ad, and they can increase the probability of you clicking on that ad by using your data. So those are two quite important pillars of the monetization. There is a great company out there, really an inspiration to us: Brave. It's a browser I use personally. They have a token, the Basic Attention Token, and you can opt in to seeing ads, but you're the one who gets paid for seeing those ads. There are a lot of other business models out there, but the idea is monetizing your attention, monetizing the fact that you see ads, yourself. Now the other side, and that's really where our customer base has driven us to focus, is the monetization of data. It's really that goldmine of information. For us, the next step is, once we've aggregated that user data, stored centrally on their devices, and given them ownership, our customers have asked: I want to play an active role in the data markets. I want to be able to share that data with brands that I'm loyal to, brands that I trust, and get rewarded for the exchange. And that is the data economy side of the story.
And that data economy is growing: the exchange of data between corporations themselves, but also the extraction of user data through all of these sources and monetizing it to organizations. Again, we're really going towards the data economy, and the way we see it, the user should be central in the decision-making and in capturing the value of their data. Because if you put it in their hands, educate them, and then put them in the driving seat of the choices, saying, yes, this company you can share it with; the other one, no? That would just be a fantastic future, and that's the one we're building.

Debbie Reynolds  27:27
Yeah, there seems to be a lot in Data Privacy laws, and this again is highlighted in the GDPR and other subsequent laws, about onward transfer or third-party data sharing. This includes data brokers and cookies and marketing and all this type of stuff. So what I'm seeing is big companies trying to limit their third-party risk around tracking and transferring data to third parties, and then I see marketers trying to create more first-party relationships with individuals. So to me, that naturally tails right into my observation. What are your thoughts?

John Arts  28:22
No, absolutely. That is the next step. We are seeing a $200 billion industry in third-party data, which is absolutely immense and, again, an opaque industry; people don't know much about it. We're seeing that crumbling for the reasons you said: organizations don't interact with them anymore because of compliance risks, and regulators and policymakers are behind them, trying to put the user central and protect them. So organizations have moved to first-party data, and there are some really great ways for them to collect that. But what we propose, as an idea, is zero-party data. The difference is really that with first-party data a user can be informed about the data collection, but is not playing an active role in it. A lot of the clicks are being tracked on the website; the brand knows, okay, there's an X percent dropout on the shopping carts, and can find a correlation with products they've previously seen. What we envision is a zero-party data platform in which consumers are proactively sharing that with brands, so that they are in control. They have the peace of mind: hey, I know what's happening here, and I'm doing this actively. And brands then reward those consumers through loyalty points, whether it be cashback or a discount; some brands have spoken about exclusive access to new product lines or events. That zero-party exchange would really allow a whole new ecosystem to be built: users play an active role, and brands get the insights directly from the user in a compliant and engaging way. So that is really the win-win that, in the longer run, our platform will enable.

Debbie Reynolds  30:16
Yeah. And I think, too, when customers have a good relationship with these businesses, they give more accurate data. I know a lot of people give false data to companies they don't trust. Or some companies, when they are doing this kind of third-party snooping, are making inferences that may not be true about the individual. So being able to have that relationship and that trust means that this person says, okay, this brand is of value to me, and I'm willing to give them data that helps them help me.

John Arts  30:56
Right, exactly. Because if it's a brand you trust, you're very happy that they improve their services towards you. And that's exactly the trust relationship we're trying to build. The way we put it internally is really building privacy-first relationships in a customer-first era. So that is really turning the whole mindset around. And the brands that are fans of this, that are inspired by this, are the ones, as we discussed, that are going to be the winners of the future, because they think about this privacy movement as an opportunity and not as a pain point.

Debbie Reynolds  31:33
Yeah, right. I know that people are holding on tooth and nail to try to keep some of these old models, but it breaks down; it's not going to work in the future. So the faster people get onto this train, the better off they're going to be. Well, we've talked about your mission with the company and the product. Tell me a little bit: what is it, if anything, that you would like to change about privacy or data in the future? What would you be trying to do?

John Arts  32:18
It's really transforming the way data is transacted. I think that is the root problem in the industry. Going through those third parties is breaking consumer trust and limiting the full potential of data, with all of these concerns: people trying to go through their settings and dropping out, or trying to make data subject requests, really suffering from, let's say, the complexity of managing their data. And then the brands increasingly have a need for new data sources. So what I would really love is, as we described, to enable a one-on-one, zero-party exchange in a trusted environment. Now, maybe on a more practical level, what we've found difficult is the execution of the right to be forgotten. A lot of consumers are writing out and sending right-to-be-forgotten requests to companies, and the response rate is really bad. Companies are trying to avoid this. Some companies, not to name any names, are going so far that when a consumer asks them, hey, please delete my data, using the email address they used to sign up for the service, the companies are asking, can you please send over your ID card so we can identify you? This is absurd, right? There's data minimization, and then a company is asking for more data, really sensitive data, through an insecure channel, which is email. So what I would love to see, and policymakers see this problem very clearly, so it will change, is a more standardized way of invoking those rights, and increasing the response rate for the user. That would, on the one hand, facilitate it for companies and for users, but also enable a lot of new innovation in this space.

Debbie Reynolds  34:18
Yeah, I think there has to be a paradigm shift. The right to be forgotten is probably the most difficult thing that companies have to grapple with, because a lot of their data collection, or even the applications that they use, are not meant to forget; they're built to remember. So some of these things make it hard to even locate, let alone delete, that data. So I think having ways that make that data more transparent to the company, so that they can fulfill these requests, is really, really important.

John Arts  35:03
And I think BigID does a tremendous job in that.

Debbie Reynolds
Yeah, well, I love them. I've loved their product for many years. And Dimitri and I are buds; he's a very nice man, and I really like the way that they're thinking about the problem. I had him on the podcast actually, not long ago. He's great. Dimitri Sirota, who is the CEO of BigID. Tell me, what makes your product different, what makes it stand out, in the consumer privacy space? One thing I like: I like the fact that you show people what their data is worth. This is the big question that no one really wants to answer, and some people don't think you should assign a monetary value to data. But data does have a value, right? Because companies are buying and selling data right and left. So I think, in a way, being able to see that helps people visualize, makes it more concrete, how data that they share willingly may be of value to someone else. What are your thoughts?

John Arts
No, absolutely, that is really a point of engagement for our customers, and it's where we are unique: the visualization. Because in the whole data ownership space, a lot of people have tried to crack it, let's call it the data wallet value proposition, but not that many companies have really sat down that closely with the consumer to understand: What are the needs? What can drive engagement? What do they want to know? And how do we solve that problem? We were really close, with our ears to the ground, speaking to the consumer and finding out, okay, what do they want?
So the first thing that is really unique is the visualization: the dashboard is really engaging, and people are sharing it with friends, sharing it with fellow privacy-interested people, saying, hey, have you seen this? But what is, to me, even more touching is that people are sharing it for education purposes. They're sending it to their mothers, to people in their family who are not that aware of the landscape, saying, hey, you should really know this, you should be aware of this, because it changes the way you interact with online services. So the visualization is one thing. Secondly, there is currently no product out there that allows users to say both yes and no to companies. There are products that say, hey, restrict your data from this list of companies, and there are others that say, hey, you can share it with companies. What a consumer wants is a single platform for saying yes and no, and that choice, that optionality, is really what is key to control for us. Having the choice is what control means. So I think those two things are really unique, and, finally, we leverage the GDPR for the aggregation of data and the restriction of data.

Debbie Reynolds  38:25
Yeah, I really love that. I like that you're doing that. Plus, you know, the way that things are happening now, in terms of how data is being collected from individuals, is just not sustainable. I saw a statistic online that said the average person has around 90 different usernames and passwords. So, you know, can you imagine going into every single account and trying to do this privacy dance, where you're trying to configure all this stuff? It's just too much. So I think the future is people having management of their data, almost like a bank, where they pick and choose who has access. Because you just can't continue this way, you know? What are your thoughts?

John Arts  39:19
No, no, absolutely. That is where it's going. And the way it should be going is in a very understandable and accessible way. Like the neobanks transformed the banking industry, Rita as a data wallet will make it really accessible and hopefully transform the data industry. It's the wallet story first, but the wallet needs to be packaged very nicely. You need the initial value proposition for the user to use the wallet, and then it's all about the transactions: how many transactions, either saying no or saying yes, can you do on the platform? So we see that as the straight line to success, and to growing a user base where we can provide value to as many people as possible with this product.

Debbie Reynolds  40:03
Yeah. So if it was the world according to you, John, and we did everything that you said, what would be your wish for privacy, either, you know, in Europe or anywhere in the world, regarding technology or even regulation? What are your thoughts?

John Arts  40:22
Yeah, the biggest wish, what I used to always say even a few years ago, is that people would march in the streets for their privacy, right, as they're doing for global warming issues. And it's been so nice to see that play out in the last couple of years. Global warming was a societal problem that became an individual's problem. Same with privacy. It was always a societal problem, but the consumer never felt it; it was never tangible for them. And in the last couple of years, or even months, we've really felt that shift towards a personal problem; people are taking action. And that gap between consumer attitudes and consumer behavior has completely transformed; people are now actively looking for ways to take control. So that, you know, would be, I think, my dream: that everyone becomes aware of the value they're sharing and the risk they're taking by interacting with online services, and that they then take control.

Debbie Reynolds  41:23
That's wonderful. Wonderful. Well, thank you so much, John; this was a thrilling conversation. I love the way that you're thinking about the problem, you know, for me and for every consumer as well. This is definitely interesting, and I like the way that you all are approaching the problem, and your app is amazing. Thank you so much; it was a pleasure. So we'll talk soon. Bye-bye. All right. Bye-bye.