"The Data Diva" Talks Privacy Podcast

The Data Diva E48 - Jon von Tetzchner and Debbie Reynolds

October 05, 2021 Debbie Reynolds Season 1 Episode 48
"The Data Diva" Talks Privacy Podcast
The Data Diva E48 - Jon von Techzner and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to Jon von Tetzchner, CEO of Vivaldi Technologies and creator of the Vivaldi web browser, and former CEO of Opera (web browser). We discuss his work on the early Internet, the evolution of web browsing from the 1990s to today, his experience growing the Opera web browser to a 350 million user base, the fundamentals of respecting the data privacy rights of consumers, the challenges of inference, the problem with consumer consent, cookies and web tracking, data monetization by companies that hold consumer data, cookie cases in the EU, why he created the Vivaldi browser, the ethics gap with data privacy, the need for regulation, third-party data sharing, and his wish for data privacy in the future.


40:03

SUMMARY KEYWORDS

people, companies, data, collect, Vivaldi, information, Internet, browser, privacy, inference, regulation, ads, consent, concept, thoughts, browsing, world, services, adapting, customers

SPEAKERS

Jon von Tetzchner, Debbie Reynolds


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds, and this is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. Today I have a special guest on the show, Jon von Tetzchner, who is the CEO of Vivaldi Technologies. Vivaldi is an Internet browser, and actually, I was happy to see that Wired had written a very complimentary article about it recently. Let me see, I have the title right here: "You're Probably Not Using the Web's Best Browser." That's the title of the Wired article. So, Jon, you've had an illustrious career in not only browsing but the Internet generally, so I would love to talk with you about how the Internet has evolved over the years and about your work. You're also the founder of Opera, the Opera browser, back in, I think, 1995 or 1996. That was around the time that I was getting involved with the Internet, so I remember what it was like back then, where people couldn't find things, or you really needed the exact address of where you wanted to go before you could actually locate things on the Web. Browsing back then was more of an innocent exercise; the browser was a helper to help you find things. But browsing has evolved over the years, and companies have created more stickiness to web browsing and have taken advantage of it, good and bad, right? So now we have things like cookies, surveillance advertising, and things like that. So I would love for you to introduce yourself and also just give us more of a background on your journey through the Internet age, basically.


Jon von Tetzchner  02:24

Sure, I'm pleased to be here, Debbie, and good to be chatting with you. So I started working with the Web all the way back in 1992. Those were very early times, and we were setting up the first Norwegian web server, actually one of the first 100 in the world. So we were quite early, and I have been there on the Web from the very, very beginning. Initially, we were doing things like just setting up pages. We made something for the company where I was working, Telenor Research, which I think today would be called an intranet, but the concept didn't exist at the time. We were making some search solutions. I made some tools to translate documents onto the Web. So if you were using FrameMaker, which was a popular tool at the time for writing scientific documents, you could translate that to HTML. And then we decided to make a browser inside the research lab, and then we were allowed to take that out. So I've basically been doing browsers for most of my life, it feels like, and I have thoroughly enjoyed it from the very beginning.


Debbie Reynolds  03:51

Yeah, I think browsers have become so ubiquitous that some people now think the browser is the Internet; it's a way to help you find things on the Internet. Talk to me about the transition. First of all, the Internet was initially made to help people communicate, right? Share information. And it was really meant initially to be open, right? Open so that everyone could be on it, everyone could do different things. And then, over the years, it has transitioned into a lot of different things that we probably didn't anticipate. Now we have things like surveillance, we have advertising, cookies, things that can do profiling and fingerprinting on people. So what happened? How did we get here?


Jon von Tetzchner  04:51

Yeah, well, if you want to be a purist, you'd say the Internet was designed way back; it was initially made by the military and then was mostly used by research institutions and universities and the like. So from that perspective, I've been on the Internet longer than the Web. The Web started more like in the 90s, around 1993, when things started to move, with the first popular browser, Mosaic. And I think, initially, yes, the focus was basically, OK, this is a place where you can place information. In the early implementations of web browsers, what you had was simple documents. You didn't have applications, you didn't have moving things, you didn't have videos. Those things came a little bit later. But really, for us it always was, OK, we want to get as many people onto this as possible, because access to information is a beautiful thing and a big unifier when people have equal access to information. That's been my focus over the years. And then we've seen certain companies, because there's a lot of data on the Internet and it's fairly easy to collect if you want to, and if you're ethically challenged, then you will collect information. And I think that's part of the problem: certain companies seem to think that the rules that apply to all of us don't apply to them. They've started to collect information on us, which is then used to profile us, and they give that information to advertisers so they can send customized advertisements. So I think this is kind of a derailing of what was set up. A lot of us were seeing, OK, the Internet is having a really good impact on the world, we see more access to information, we see progress in the world, and I think for a lot of us, we saw progress in the world in a positive way, overall, for a long time. And then, about ten years ago, certain companies decided to start collecting information which they had access to and which they should have kept private, and they started building profiles on us. And I mean, this is about what we do on the Internet: which pages we visit, what content we view, how long we are viewing certain content on a page based on our scrolling position, potentially what we click, location information, including location information inside buildings, potentially what we are close to inside buildings, what we then end up purchasing, which they then get from the banks, etc., etc. It seems the thinking is that if the information is there, then it's OK to use it. And, for me, this is extremely foreign. From the very beginning, when I started working on the Internet, the idea was that if you had information on your customers, you should keep it very private; you shouldn't share it with anyone else. For example, if you are providing email services, then through providing those services you technically might have access to some emails or the like from your customers, but you shouldn't go in and read them, right? And you shouldn't be scanning them to find out where they're going, so you can show them ads if they're planning a trip or a vacation. There are just certain boundaries that have been broken. And the worst-case scenario is that the data is then being used for political purposes or influence campaigns.
And with all of this, I still think the Internet is a beautiful thing. But I think we need to stop the bad stuff from happening. We need to stop the collection of information, and we need to reverse course on the kinds of advertisements that we see on the Internet.


Debbie Reynolds  09:15

Yeah, I agree with that. What are your thoughts? Well, first of all, let me back up a bit. During your time with Opera, you were able to grow that company's user base up to 350 million people. That's quite an accomplishment; that's about the size of the US, right? Think about the number of people. People have choices about the browsers that they use, and obviously some have more users than others. And now, I don't know what happened, maybe it was COVID, where people were being asked to share more information about their health, but I feel like consumers started to wake up about privacy and really push back, wanting to understand what companies were collecting about them and saying, we want to have more agency over our data. Are you seeing that shift? Because I think the popular thought or concept that was out there for many years was that consumers don't really care about privacy, so let's just do whatever. But now we're seeing consumers start to push back and really show that they do care about privacy. What are your thoughts about that shift?


Jon von Tetzchner  10:55

Well, there are a number of things here. And yes, I think people definitely understand that there's a collection of information and that it's not nice. But for me, there is a different angle to all of this. People may say, you know, I have nothing to hide, I don't care, and I get the services for free. I mean, the propaganda works, right? The propaganda says that we need to share this information to get the services, which is a lie; it's not necessary. And that this is only about personal privacy. For me, it gets really ugly when you look at privacy not of one person or hundreds of persons, but of basically full populations. We are basically put into camps, and you have companies that are analyzing the data and finding out that if you do this or that, if, for example, you walk a certain path every day, OK, you're a dog owner. Do you visit your parents often? They try to analyze maybe even the music that you listen to, or something else, and they place you into camps, which they then utilize to send ads that are relevant to you, or sell to advertisers. And to me, where it's the worst is the concept of us getting different information, the fact that we are put into different groups and then we're getting different levels of information. I think it's not a coincidence that the division that we've seen in the last few years has happened at the same time that this collection of information and this surveillance-based advertisement has prevailed. It's also related to the algorithms that basically look at, OK, you see this, your friends are looking at this, and the way the algorithms work, they choose content which your friends are choosing. And we all know that we click and look at things that we don't necessarily like, but it's there. And then it becomes a self-sustaining system where we get more and more of that content. And then you can put an advertisement into part of that, and then you can influence us, to make it even worse. So I think we have to realize that there's a correlation between what we are seeing, the division in our society, and the use of this data. It's the perfect programming machine. The way I like to look at it is that we have made what we call an API in the programming world, a programmable interface to us. And that's made available to the highest bidder, to buy access to us based on information that the system feels says we are like this or that. I mean, the fact that Facebook recognizes our political leanings, I think that's wrong. Facebook shouldn't be finding out what our political leanings are. And they may get it right or wrong or whatever, but is it correct to collect that data, is it natural to collect that data? I'd like to ask people: would you think it's OK that your mailman read your mail? Is that OK? Or that, based on the fact that you got certain bills in the mail, suddenly you get hints from someone that maybe you need a bank loan? Is that OK? Is it OK that the phone company would listen to your phone calls? I mean, if someone comes and does your cleaning, or is painting your walls or something, you get someone working in your house for some reason,
would you think it's OK that they put in a recording device and listen to your discussions? Or maybe make a list of all your furniture? I mean, technically speaking, in the digital world, this would all be easy to do. And sadly, this is what is happening in a lot of cases. And to me, this is just wrong. These companies are basically telling themselves that no one cares, and then they try to convince everyone else that no one cares. But I think we do care, and I think they should too.


Debbie Reynolds  15:32

Yeah. So let's talk a little bit about, well, first, I want to talk about inference, and then I want to talk about consent, right? So one of the things that troubles me is how extraordinarily dangerous inference can be. Based on information that's collected about individuals, inferences are being made, and as a result of those inferences, decisions may be made about people. So it's not just advertising. Maybe something like your example: someone goes to take a walk at a certain time, and they say, this person is a dog walker, maybe I'm going to give them coupons for dog things, or whatever. That, to me, in advertising, even though it definitely has a creepy factor, is different from someone saying, well, because this person walks in a neighborhood that has high crime, we think this person is a criminal, and then I'm going to deny them employment or some other opportunity because this inference is there. We don't know whether it's true or not, but we want to lower our risk, and we're going to sell it to an insurance company or an employer. What are your thoughts about that?


Jon von Tetzchner  16:52

Yeah, I think it is really dangerous. I've seen companies that are looking at ways to use your data, data they have access to one way or another, to try to figure out how a person would be, say, as someone working for them. And I think that can be dangerous, depending on maybe what music you listen to, or, like you say, all the patterns that you might have. I think that's a dangerous thing to do. I think it will quite often be wrong. And I also think it's kind of belittling us. It's making us smaller; we're just numbers inside groups. It's not treating us as the individuals that we are. And I think there's something seriously wrong with that as well.


Debbie Reynolds  17:43

Yeah, so let's talk about consent. Consent concerns me a lot, because people can consent to things that are not in their best interest. And then the value exchange, between the value that I get as an individual and the value the company gets, is asymmetric. Companies can get more value from me than I can get from them. And if I'm consenting, especially if I'm not 100% clear on what I'm consenting to, that could create trouble or problems for me down the line. What are your thoughts about consent in general and the kind of transparency around it?


Jon von Tetzchner  18:27

Yeah, I think this is a significant problem. And we've seen this with GDPR, with the idea of the cookie dialogues asking people for consent: are you OK with this? It's similar if you're installing an operating system: are you OK with them tracking you and the like? In reality, quite often you don't really have much of a choice. It's kind of like, OK, take it or leave it. Do you want this operating system? Then that means we track you. I think there are certain things you shouldn't be allowed to ask for; I don't think you should be allowed to ask for the right to surveil your customers. I think it's insulting to your customers. There is no reason that because you're providing some kind of service, that should give you information about the customers that are using your service. That doesn't mean that you can't do things in aggregate for us. As an example, OK, your data when you're driving could be useful for a mapping service, so it can provide the best route; that's useful. That doesn't mean that that information needs to be collected on you specifically over time, or on certain groups either. So there will be a lot of data. The question is, are you allowed to use that data for purposes other than providing the service? And to me, the answer to that question is no. You shouldn't be allowed to use the data for other things, particularly not for building profiles on your users, because it's wrong.


Debbie Reynolds  19:58

Right, right. So what are your thoughts? You mentioned GDPR, and actually, this is more about the ePrivacy Directive. What are your thoughts about the cookie cases that are happening in Europe right now? I guess, for me, I feel like just focusing on cookies isn't sufficient, right? Because there are other ways to achieve the same type of tracking or surveillance that don't require cookies. So if we're focusing too heavily on cookies, we're not looking at the bigger issue.


Jon von Tetzchner  20:36

Yeah, I mean, you can collect data in very many different ways, and you have companies that have access to most sites. Well, there was the Google FLoC case, which is an interesting one. The concept is that instead of collecting the data on the servers, you collect this data on the computer, which will collect even more data on you, but store it on the machine, and then send it to sites when you visit them. So it's a different way of doing it, a different technology; instead of using cookies, it's all collected on the computer, and the result of that is giving Google more unique rights to the information on you. So it's not solving the problem. In some ways, it's just giving priority to one company.
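To make that contrast easier to picture, here is a minimal, hypothetical sketch of the two models being compared: classic third-party cookie tracking, where a stable identifier is reported to a tracker's server on every visit, and a FLoC-style approach, where browsing history stays on the device and is reduced to a coarse cohort label that sites can read. The names, the toy hash, and the data are all invented for illustration; this is not Google's actual FLoC code, which used a SimHash-based cohort assignment that this does not attempt to reproduce.

```typescript
// Hypothetical illustration of the two tracking models contrasted above.
// Names and logic are invented to show the shape of the idea only.

// Model 1: classic third-party cookie. A stable ID is sent to the tracker's
// server on every page load, and the profile is built server-side.
interface CookieHit {
  trackerId: string; // stable identifier stored in the cookie
  pageUrl: string;   // the page being visited
}

function reportToTracker(hit: CookieHit): void {
  // In reality this would be an HTTP request to the tracker's domain.
  console.log(`server-side profile update: ${hit.trackerId} visited ${hit.pageUrl}`);
}

// Model 2: FLoC-style cohorts. Browsing history stays on the device and is
// reduced to a coarse cohort label, which is what sites get to see.
function computeCohort(history: string[], cohortCount = 1024): number {
  // Toy hash over visited hostnames, standing in for the real cohort math.
  let acc = 0;
  for (const url of history) {
    const host = new URL(url).hostname;
    for (const ch of host) acc = (acc * 31 + ch.charCodeAt(0)) >>> 0;
  }
  return acc % cohortCount; // sites see only this number, not the history
}

const history = ["https://example-news.test/article", "https://example-shop.test/shoes"];
reportToTracker({ trackerId: "abc-123", pageUrl: history[0] });
console.log(`cohort label exposed to sites: ${computeCohort(history)}`);
```

Either way, a label about the person still ends up being derived from their browsing; only where it is computed changes, which is the point being made here.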

I mean, again, in my view, none of us should be collecting this level of information. None of us should be building profiles on our customers. And it doesn't matter where it happens, whether it's using this technology or that technology. Some of us will have access to a lot of information on our customers, and we should treat that data with the greatest care, make sure no one else gets access to it, and we shouldn't misuse it in any shape or form. Because it's not our data, and I don't think it should be ours to buy. It just feels wrong to ask for this. You don't ask people for private information and think that's OK, or that it's something you can just collect. I just think it's wrong, and it's one of those questions you shouldn't be allowed to ask.


Debbie Reynolds  22:24

Yeah. And back to your point about the way that browsers collect information and sort of categorize people, trying to put people in buckets or categories. When people don't neatly fit into categories, that's where bias can show up, right? Where you're making an assumption about someone that doesn't really fit, and then there may be adverse actions taken towards you as a person as a result of that.


Jon von Tetzchner  22:55

Yeah, I mean, it's clear you can land in the wrong groups. But also, it's a question of, should we be grouping people in this way? Should we be treating people differently based on some information? It feels wrong in this day and age that we have those concepts. And it's problematic when you have a situation where political campaigns may be sending out tens of thousands of different ads to people every single day, with those changing on a daily basis. Where's the accountability? Do you have one message, or do you have 10,000 different messages to people? What is the message? Why should we be voting for you? There are a lot of things like this which I think are really unfortunate. And I think the easiest way to reverse that is basically to say that, again, you can't be grouping people this way. So if you're advertising, then you're advertising in a way that is seen by all the public in a region and the like, so it's not targeted to a certain group of people. I don't see ads as something that we should get rid of in general. There are a lot of good things that are funded through ads. The Internet was actually funded through ads in the beginning. But the difference was that the ads were the same kind that we knew from other publications. You would see ads relevant to the articles that you were reading or to the magazines you were reading. You'd go to a tech site, you would see tech ads, instead of what we see now, where you happen to go to a site, and then you see ads relating to products on that site for the next two weeks, wherever you go. Not really useful. And yeah, again, I'm rambling a little bit on these things, but I think the whole system is so broken. We had a working system; certain companies wanted to make even more money, and they broke the system, and they need to reverse their actions.
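As a rough illustration of the distinction being drawn here, the sketch below contrasts contextual ad selection, which looks only at the topic of the page being read, with behavioral selection, which looks up a stored profile of the person. Everything in it, the ad inventory, the profile store, and the function names, is invented for illustration and does not reflect any real ad platform's API.

```typescript
// Hypothetical sketch contrasting contextual and behavioral ad selection.
// All data and names are invented; it mirrors the distinction in the
// conversation, not any real ad system.

type Ad = { id: string; topic: string };

const inventory: Ad[] = [
  { id: "ad-1", topic: "tech" },
  { id: "ad-2", topic: "travel" },
  { id: "ad-3", topic: "finance" },
];

// Contextual: the ad is chosen from the topic of the page being read,
// so no information about the reader is needed at all.
function contextualAd(pageTopic: string): Ad | undefined {
  return inventory.find((ad) => ad.topic === pageTopic);
}

// Behavioral: the ad is chosen from a profile accumulated about the person,
// which is exactly the per-user data collection argued against above.
const profiles = new Map<string, string[]>([["user-42", ["travel", "finance"]]]);

function behavioralAd(userId: string): Ad | undefined {
  const interests = profiles.get(userId) ?? [];
  return inventory.find((ad) => interests.includes(ad.topic));
}

console.log(contextualAd("tech"));    // picked from the page, not the person
console.log(behavioralAd("user-42")); // picked from the person's profile
```

The contextual path needs nothing about the reader, which is the older model described above, while the behavioral path cannot work without a per-user profile.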


Debbie Reynolds  25:08

So tell me a bit about why you decided to create Vivaldi, the browser, and how that relates to people managing Data Privacy?


Jon von Tetzchner  25:21

Well, for me, in some ways it's a continuation of what we were doing at Opera during my time there. I mean, Opera has gone in a different direction since I left. But the concept is adapting to the needs of users, right? At Opera, from the very beginning, we were adapting to the needs of users; at first, it was a question of disabilities. And there's just the concept that we are all created kind of equal but different. We have different needs and different wants, and it's natural for us to adapt to those needs. And it's not only a question of the majority. I mean, you have this telemetry that is being done by our competitors, where they find what most people want, and that's the thing that they do. Well, in our case, we think the lonely voice should also be listened to, listening to every user and not just going by majority rule; majority rule is not a good thing if it presses on others. We did things to adapt to cheaper hardware and the like. And then for mobile phones, we made it so it could run on just about any mobile phone, so you could browse with limited phones in Africa, in India, and the like. And we took a lot of market share in those places. In many ways, what we are doing at Vivaldi is an evolution of that same concept. It's adapting to the needs of the individuals, so we have a lot of flexibility in the browser. And as part of that, because we are thinking about what is best for the user, we're also putting in things like tracker blockers and ad blockers, and there's also just the fact that we are not engaged in the collection of information. In a lot of ways, you would think that could be taken for granted, but it's kind of turned around in this world of ours, where people almost take for granted that you are collecting information, and if you say you're not collecting information, you're probably lying about it. That's a sad, sad situation. So, for us, we think it's important to do the right thing. We see our users as our friends, and from that perspective, we shouldn't spy on our friends, and we should try to keep them safe.


Debbie Reynolds  27:49

Yeah, and what are your thoughts? What are you seeing about how technology, if at all, is responding to these regulations like GDPR, or different things related to consumer or human privacy rights?


Jon von Tetzchner  28:06

Well, I think companies are striving to comply in general. There are exceptions; some of these big companies are trying to get around the rules. But personally, and I think a number of us agree on this, and we did send in letters to that effect, both to the EU and to the US government, we think this isn't going far enough. We just think that clicking OK in a dialogue isn't a solution. We need to ban the collection of data and the use of data in this way, the surveillance-based advertisements; we need to stop people being put into groups and providing ways to address those different groups. I think it's a dangerous thing, and that's what we would like to see regulated. Obviously, it's helpful if people vote with their feet and go use some of the other companies that are taking the same stance as we are. But in reality, sadly, at this time, most people are using products from the companies that are collecting data, companies like Google and Facebook and Microsoft and the like, and I think regulation is needed. And maybe it will help those companies do the right thing. Some of them are asking for regulation, but on the other hand, they're lobbying for the regulations to be written in such a way that they're still allowed to do what they're doing. Then they can say they follow the regulation, but in reality, they're still doing the damage.


Debbie Reynolds  29:50

So in terms of, I guess, not just regulation, let's talk a bit about ethics. Unfortunately, not all laws are ethical, and we know that there's kind of an ethics gap there somewhere. I don't even know how we can bridge the gap. But I am seeing people in things like AI and ethics who are giving voice to some of the things that you're saying, that even if some of these things are not against the law, they are wrong, and maybe they shouldn't be done. So what are your thoughts about companies embracing or looking at ethics that go beyond what the law says?


Jon von Tetzchner  30:37

I think that's what I believe companies should be doing. We should all be trying to think about, OK, are we contributing to society in a positive way? Are we doing the right things? Are we going past the boundaries of what's right to do, or even close to them? And sometimes you just make the choice that, OK, it might be legal to do this thing, but it's the wrong thing to do. And I'd like to believe most companies think that way; most of the smaller companies would like to do the right thing. They want to be good citizens. And I think with some of these larger corporations, something strange happens. They may be under pressure, and then they're looking at, OK, the other guys are starting to do this. Well, if the other guys are doing it, then it's OK. I'm trying to think how they conclude that collecting information on their customers, putting them into groups, and making it available to anyone that wants to buy access to them, even for political purposes, should be OK. I don't know. Sometimes I've tried to understand it. Was it basically, like with Facebook, that there was a time when Mark Zuckerberg was under pressure and needed to show that they could make money, because they weren't making a lot of money? Did he then make the wrong decision? Is that how it happened? I don't know; I'm not in his brain. And there are obviously those that say, OK, this is just how it was planned from the very beginning: you build something for free, and when people get used to it, then you start to do things that are not OK. But I don't know; from that perspective, it's all speculation. I think the main thing is that now we have companies that are doing things which they know they shouldn't be doing. It's maybe difficult for them to get back on the straight and narrow, and we need to help them. Hopefully, these companies can be brought back onto the straight and narrow and be good citizens again.


Debbie Reynolds  32:50

Yeah, what are your thoughts about regulations that are trying to rein in third-party data sharing? So I give data to one company, then the company gives it to someone else, and they try to create more transparency, at least to get the consent of the individual, in some cases, for that data transfer to occur, and then to have more visibility into who those third-party companies are. Do you think that would be an effective thing?


Jon von Tetzchner  33:21

No, no, no. Well, I think if you go back in time, right, then the concept of sharing data was, you wouldn't do that, right? And in particular, you may have a company that's contributing to your services. This could be Google, for example, with Google Analytics. The fact that you get a piece of software from Google, Google Analytics, doesn't mean that Google should be able to use that data for other purposes. Analytics on a page from a company is in itself probably not that problematic in many ways, and sometimes you actually would like them to recognize you as a person on their site. So if you purchase something, they will know that you purchased it and the like. And as long as that's what it's being used for, to help you when you get onto the site, that's one thing. When they then share it with other companies, and those start to use it to show you things that you don't want, then it gets really murky and ugly. So obviously, keeping that data limited is helpful. But again, it shouldn't be a question of just, OK, if you get an OK from the users, then everything is fine. Most of us just click the thing: there's a dialogue, we click it. We don't really have much choice. It's kind of like going to the bank, and the bank asks you to sign this paper, which is 20 pages or whatever. It's not like you can say, you know, I didn't like that sentence on page 15, could you please change it for me? They basically say, no, no, no, OK, go find yourself another bank. And you'll find that the other bank actually has a similar or the same clause. The position of power is clearly with those companies; as a user, you don't really have a choice. And even if you had a choice, I just think it's wrong. Again, they shouldn't be sharing the data in this way. It shouldn't happen.


Debbie Reynolds  35:42

Yeah, so what would be your wish for data privacy in the future, whether it be law, technology, or people, anything anywhere in the world?


Jon von Tetzchner  35:55

I think we need laws on this, and in particular, we need them in the US and Europe. I think that will help, because a lot of the companies are in the US and Europe. And then we need any company that's on the Internet to be following certain rules and regulations on what you're allowed to do with data and the like. And I think it should be very limited. You may have a lot of data on your consumers because you're providing certain services; I think that comes with the territory in some cases, but that doesn't mean that you can use it to build profiles or share it with others. Again, there may be situations where you're using software and services and the like, but it's a question of, does that give the service access to the data so they can use it for other purposes? And the answer is no. Obviously, you need companies to be able to work together, and we shouldn't be getting in the way of that. But again, ownership of the data is basically the user's, and data shouldn't be misused in any shape or form. I think that's basically the important thing here, and you shouldn't even be able to ask for permission. It's not right.


Debbie Reynolds  37:17

Yeah. And also, I think we're starting to see, because people are caring more, or showing that they're caring more, about privacy, that privacy can be profitable for companies. So if you help consumers protect themselves, they will want to use your products.


Jon von Tetzchner  37:39

Yeah, I think many people care about that. Yes, we are seeing that people are caring about privacy. I think the issue that we have in particular is that with the big companies, you can't really get away from them, right? I mean, you're not going to stop using a mobile phone, you're not going to stop using a smartphone, you're not going to stop using a PC. But it's good that people are choosing companies that are listening; they don't want to be treated like merchandise, right? They want to be treated as customers and with respect. And there are people that are choosing to switch to companies that treat them that way, that treat them with respect and don't just treat them as merchandise to be bought, accessed, and sold. So yes, there are people that are going in this direction. I think the problem that we have is that even if you wanted to, it gets really, really hard to get away. But there are ways. You can select Vivaldi as a browser. You can use Linux as an operating system. You can use a good search engine, like Startpage, DuckDuckGo, Neeva, or the like, that doesn't collect your data. So there are choices for people, and I think there are more and more people that are seeking out those alternatives. But I still think we need regulation.


Debbie Reynolds  39:09

Excellent. Excellent. Well, thank you so much for being on the show. This is great. I know that the listeners would really love to hear your point of view and perspective, and I'm really happy that we got a chance to have this conversation.


Jon von Tetzchner  39:22

My pleasure. Thank you. Thanks, Jon.