"The Data Diva" Talks Privacy Podcast

The Data Diva E223 - Luke Mulks and Debbie Reynolds

Season 5 Episode 223


Debbie Reynolds “The Data Diva” talks to Luke Mulks, Vice President of Business Operations at Brave Software and host of the Brave Technologist podcast.

Luke shares his unique journey from working in ad tech to joining Brave Software, where he now champions a privacy-first approach to browsing and online advertising. The conversation explores the profound issues in the ad tech ecosystem, the challenges of data over-collection, and the opportunities for innovation in building user-respecting solutions.

Luke discusses the evolution of ad targeting from domain-specific ads to intrusive audience tracking across the web, underscoring how these methods have fueled surveillance capitalism. He explains how Brave Software takes a fundamentally different approach by respecting user privacy as a default setting while creating viable business models. By leveraging privacy-preserving technologies, Brave has proven that companies can monetize digital ecosystems without compromising user trust.

Debbie and Luke address Global Privacy Control (GPC) and the broader regulatory landscape, highlighting GPC's limitations and the challenges of relying on companies to act in good faith. Luke emphasizes that privacy solutions must be built into the technology—“an architectural and foundational approach”—rather than depending on users to opt into protections or companies to self-regulate. They also discuss the interplay between privacy and competition, exploring how dominant tech companies leverage monopolistic control over browsers, operating systems, and ad ecosystems to stifle innovation and consumer choice.

The episode also explores emerging concerns, including cashless societies, financial privacy, and AI-powered content creation. Luke explains the importance of digital tools that preserve the anonymity of cash in a digital environment and warns against the risks of hyper-centralized financial systems. He also reflects on how AI disrupts privacy, advertising, and content integrity, underscoring the need for better tooling and ethical governance to address emerging challenges.




[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello.

[00:13] Hello. My name is Debbie Reynolds. They call me the Data Diva. This is the Data Diva Talks Privacy podcast where we discuss data privacy issues with industry leaders around the world with information that businesses need to know.

[00:25] Now I have a very special guest on the show, Luke Mulks. He is the Vice President of Business Operations at Brave Software, makers of the privacy-respecting Brave browser and the Basic Attention Token.

[00:40] He is also the host of the Brave Technologist podcast. Welcome.

[00:45] Luke Mulks: Thanks for having me on. Been looking forward to this.

[00:48] Debbie Reynolds: This is great. And I've been a fan of Brave for many, many years. I knew Johnny Ryan when he was with Brave many, many years ago. And so I really champion what you all are doing, and I'm really happy we all got a chance to connect. But I want you to be able to introduce yourself, let people know who you are and what you're doing, and how you came to your position at Brave.

[01:13] Luke Mulks: Yeah, no, it's great. And before I jump in, I love Johnny Ryan too. Worked really closely with him when he was here at Brave and like especially on a bunch of the work he did with the regulators in the EU around GDPR and all of that.

[01:26] Like, great, great guy. Yeah. So I started working with Brave in March of 2016.

[01:33] And came on full time later that year. I had started in publishing initially, then got into media production, ran some startups, and then got into the advertising industry.

[01:47] On the ad tech side, around 2011 was when I jumped right in there. And I'd known a little bit about, you know, monetization and advertising from doing startups, but it was a really interesting time too, because I worked for a company called OAO, and we were basically like an agency that worked between the big ad tech stacks, like Google and Facebook and, you know, FreeWheel and all these mega companies that handled all the ad products.

[02:14] And then we worked with the major media publishers, so like NFL or PGA Tour and you know, Comcast, Universal, all of that. And basically like these companies were huge and they used advertising to monetize a lot of their business.

[02:29] But what you'd find in these places is that their developers really just liked working on development of their product and didn't really like the ad tech stuff very much. So our company basically like plugged all the holes and got all the money flowing with the ad products, which meant that from my point of view, like, as a director of ad products there, I worked on all the ad products, right?

[02:52] So, like, we worked on behalf of the ad tech companies and then on behalf of the publishers, too. So you really can see the whole spectrum of everything. And at the time, it was just bananas, because it was when Google and Facebook really started to put programmatic advertising everywhere.

[03:08] And advertising really used to be different. When I started there, advertising was more like, okay, your property is this, or, you know, you're the NFL, right? People are going to your site, we're going to cater ads to your website. And we would do a lot of work like that.

[03:24] But what started to happen with Google and Facebook taking more and more of the market is they started saying, well, we're going to focus less on targeting the property and more on targeting the people.

[03:35] And so they started to do this audience targeting. And that's when you really started to see the data collection part get bigger and bigger, because these ads went from being served on an individual domain to targeting these cohorts of users everywhere they went.

[03:51] So they had to be able to collect data everywhere you went. And from 2011 to 2016 was really when this stuff proliferated and got really huge.

[04:02] And you saw Google taking more and more and more of the advertising ecosystem under their wing, and everything kind of working within that sphere. And so I was in a position where I was not feeling very good about what we were doing anymore, because of just the volume.

[04:19] I think people really underestimate the scale of the data collection and how much of it there is. And now if you look back at how long it's been going on, the stuff I started working on in 2011, I mean, that was over 10 years ago right now.

[04:35] And you've got over a decade's worth of data on people, on everything that they browse. Companies say that they do certain things, but you can't really trust that, because I can't tell you where the data stops, right?

[04:49] Because once something is collected, a lot of these services will have a different ID that they associate with you, and then they copy that stuff. So even if the original collection is gone, a fragment of it's there and it's linked to something else, right?

[05:03] So it really got to the scale where I was just like, we're talking about billions of people here that you're tracking all the time. But the problem I had was that, one, I'd burned through a bunch of startups on my own.

[05:13] So I was kind of, like, reluctant to go do something else unless it was a company that I believed in. But also, there weren't very many companies that were saying, look, like we want to build something that can be monetized but respect the user's privacy, right?

[05:29] Like, there weren't very many solutions, if you think back to 2016, or companies that were starting new things that actually, like, were more than just kind of breaking the tracking, but actually, like, trying to be proactive in trying to say, no, you can still have a business model work without having to collect the user's data.

[05:48] And then I saw what Brave was doing in 2016, and it really jumped out at me. I was like, okay, one, here's a team with a lot of these pioneers to the web.

[05:58] You had, like, you know, Yan Zhu on the security side, and Brendan Eich, and a lot of these people that really created the languages and built the frameworks for the Internet that we all use.

[06:08] And then too, I mean, like, they were talking about actually making business work with this. And so I reached out to them in 2016 and said, look, you guys are the first thing I've seen where somebody's trying to proactively make a new business model for the web that could actually work.

[06:24] Because I think, you know, people don't realize too, there were solutions out there where they were dealing with ad blocking and things like this, but they were trying to kind of subvert user choice.

[06:34] So you'd go to a site and they would still try to show you an ad, even though you had an ad blocker on, which I was not a part of.

[06:40] We had some clients that were doing that at my old company, and it just wasn't appealing to me. But Brave said, no, we're going to do something new. Like, we want to make it so that we protect the user's privacy by default.

[06:51] And then we want to create business models that will work with that dynamic. And we think that the technology is available for us to start to build on this and do this.

[07:00] And I said, okay, that's where I want to go work from. And I didn't really know coming into it. I'd never worked for a browser before. And, you know, now we have like a browser and a search engine and all of that.

[07:12] And it was just like, very new to me, working for a browser. But once I started to work with the browser, it was like, wow, okay, this is such a powerful tool that people don't realize that they have where everything you do is in the browser, right?

[07:26] So, like, if you think about your browsing history, your payment methods, your passwords, your everything is in your browser. And what really started to jump out at me, the more I looked into it and the more I worked with the browser, was, wow:

[07:42] We don't need to have these companies collect your data anymore. If we can figure out ways to use the data that you have on your device without leaking it to anybody, and if we can blind ourselves from that as an organization, then you can have this really new opportunity to actually have effective business models that don't collect the user's data at all.

[08:05] And that was stuff that really got me interested in this, because nobody had really tried on the privacy side. This was before GDPR was a thing. I remember I would go into meetings and people would kind of laugh at us, like, no one cares about this. Privacy? Privacy and advertising?

[08:21] What are you doing? Why did you waste my time? That kind of stuff, back in 2016, '17. And so it was a use case that really jumped out at me.

[08:29] But there was another thing that Brave was talking about doing, too, that was really interesting: they were talking about making it so that users could actually get a revenue share, if they wanted, for their attention, which was something where it took me a minute to think about it.

[08:42] But then, you know, basically it's like: without your attention, there is no advertising ecosystem, there is no attention economy, right? And if you think about it, these companies have gotten to be so huge.

[08:57] Like, some of these tech companies have more revenue than the GDP of some countries, right? They built that off of collecting and processing and storing and sharing, blah blah blah, user data, right?

[09:09] Like your information.

[09:10] And when Brave started talking about this concept of, let's give users a revenue share for their attention, that was another thing that really kind of got me hooked on what they're doing, because it's about first principles with this stuff.

[09:25] And if you think about it, these companies have basically built a user-hostile model that has scaled. And what Brave is doing is we're saying, look, as a new actor in this space, we can make something that's going to keep the user's interests as a priority.

[09:46] So, like, that means: is the user's privacy protected? Is this something that the user has control over? If it is, great, that can live with Brave. If it's going against the user's interest,

[09:58] we're going to block that in Brave. Right. So really thinking about it as more than just privacy, but also the user's interest. If you think about it, the browser is technically called the user agent.

[10:09] Right. If you look at how data is processed, though, the user agent is no longer the user's agent, because it's being used by the companies that are getting in between the user and their browsing.

[10:21] So what Brave's really trying to do is restore that agency for the user, whether it's at the browser level or the search level or whatever. And, you know, it's 2024 now, almost 2025, probably, by the time this goes out.

[10:34] And we've kind of taken a lot of theories, or hypotheses, we had and turned them into global use, with people that are making a movement out of what we're doing and kind of proving with their feet that they want to move to a model that puts their interests first.

[10:54] Debbie Reynolds: Very good. So what is happening now that's concerning you most with the ad tech ecosystem, or regulation, or anything that's happening in privacy?

[11:05] Luke Mulks: Yeah, I think a couple things. One, it was really kind of disheartening to see what Google did. GDPR kind of became a thing, right, in 2018, and then really started to take hold in 2019, 2020.

[11:19] And Google started to take a position of like, okay, we're going to get rid of third party cookies in Chrome and we'll finally take that step. And they were trying a few things, being able to serve, you know, ads from the browser and things like that.

[11:33] But they recently said, we're not going to do that anymore; they're walking back on getting rid of third-party cookies. And I mean, it's definitely a regression.

[11:43] I think that, in that space, GDPR was great in that it helped to define what user data is, get you a practical definition. Because if you don't have somebody defining that from the regulatory level, you're really going off of the word of either academics or what the big companies are saying privacy is.

[12:04] Like, if you look at the US, right? Apple is advertising itself as a privacy solution. Google says things like, privacy is paramount to us. All these companies are the ones that are saying what privacy is.

[12:14] But you go look at Europe, or work in Europe, and they say, no, here's the definition of what user data is, and this is how we care about it.

[12:22] And so GDPR kind of applied to everybody. But the problem with GDPR is that either the regulators don't have enough of the brain trust to enforce it properly or they just aren't going to enforce it.

[12:36] Right. And so, you know, you mentioned Johnny Ryan earlier. I worked with him on a bunch of efforts around this, trying to kind of show the regulators, back in, I think it was 2019, 2020, where we were presenting to the ICO.

[12:49] We were like, look, this is a free fire zone for user data in this real time bidding system with advertising. We presented a ton of evidence around this and they even agreed with us.

[12:59] But then it ended up being a toothless thing because nobody was willing to take on the companies about this. And I think that, you know, it's a challenge with these things with privacy regulation.

[13:10] The real thing that I'm kind of getting concerned about is, I mean you've seen some moves like we've been able to grow to, you know, 80 million monthly active users and 30 million daily active users globally.

[13:21] Right? Like, that's a lot of people that are saying, we're going to move to a privacy-preserving solution. But then you look at, like, Proton's got, you know, 50 to 100 million users.

[13:30] You see other companies that are building privacy tools that are getting adoption now. Signal is another great example, like a ton of people using it, right? And so that's great.

[13:40] The area I'm looking at right now is a lot around financial privacy, around financial transactions. And there's some recent events where governments have kind of gone after people building privacy tools for helping people to have private transactions.

[13:54] And I'm seeing things like, you know, I take my kids to Yogurtland, right? And they say, cash not accepted here anymore. And it's like, we need to have a transaction medium that gives you that kind of cash-level privacy and anonymity that's digital.

[14:11] And you know, there's Zcash, which is great. But I'm getting pretty concerned now that you're seeing a lot of this cashless system kind of showing up, and you're seeing, you know, not a lot of good solutions that have parity with what you can do with cash.

[14:25] And people are starting to kind of cave on these things and say, well, you know, you can't just let anybody have financial transactions with privacy, because you'll be supporting crime or whatever, these kinds of scapegoats.

[14:38] And I think it's an area where we have to be really vigilant. But I'm really excited about the tooling that's out there now. Like, with privacy in general, I think you're starting to get more and more people turned on to how this is important.

[14:52] I think unfortunately it's one of those things where so many people get impacted by things like data breaches and start to see kind of the dark side of what happens when companies have so much of your data.

[15:04] It's something you didn't used to see as often at all. And so I think that some of that is kind of creeping in, and then, with just the sheer volume of data that is out there, I think people are starting to wake up to that. But also just, I mean, candidly, right?

[15:19] Like, there are some real benefits to having privacy-preserving software. We get a lot of users on Brave by saying, look, if you want to watch YouTube without ads, or, you know, have a faster browsing experience, or have less data consumption, right?

[15:33] Like, all these things. The ad tech ecosystem really weighs your device down by trying to run things all the time that are trying to collect your data. So by having these privacy tools, not only do you have better privacy, but you also have faster load times.

[15:48] A recipe website actually looks like a recipe. It doesn't look like 500 things being sold between the steps, right? Like, where you're really struggling to fight the ads to get to the content you want.

[15:59] And so I think that there's a mix of things happening where the industry is almost eating itself in a way while the tooling is getting better and it's getting more usable.

[16:09] And I think that's been a real problem in the past: there have been good tools out there, but they break so much of the web that you can't really recommend them to people, because their bank website won't work or whatever.

[16:21] And people don't want to have to use more than one thing for something. We're competing with technology that's saying, hey, you can just scan your palm at Whole Foods and pay for your groceries.

[16:32] You know, it's got to be that convenient. And now we're kind of getting to the point where stuff's starting to get that easy, which I'm super excited about.

[16:39] I mean, I think there's a lot of negative things that people can get worked up about, but I think the positive is that you're starting to see the market move towards privacy, almost at every level.

[16:50] And it's going to be up to more companies to really take an aggressive push on this. And I think that a lot of them are going to have to be new companies because a lot of these big companies are kind of stuck.

[17:00] They can't put the toothpaste back in the tube, so to speak. Like, if you think about Google, Google is a publicly traded company, right?

[17:09] Like, they have been convincing shareholders for years that their advertising model is the best and it's worth the premium. They can't all of a sudden say, hey, look, this thing we've been doing is not really great for privacy at all, because they'll get sued by their shareholders.

[17:23] Right. They're kind of stuck in this dilemma. And they also just have got this whole ecosystem, that house of cards that they've helped build and they can't go negative on that or the whole house of cards falls down and they don't do so well.

[17:36] So I think that it's one of those things where you really have to take some pioneering chances with new companies and those new companies have to be forced to build stuff that's easy to use.

[17:46] So I'm pretty excited about that. Yeah, I know, it was a really long-winded answer.

[17:51] Debbie Reynolds: No, no, that's fine.

[17:53] Luke Mulks: Yeah.

[17:53] Debbie Reynolds: So I want your thoughts about the shift that you're talking about. I want to dig a little bit deeper on that. And so this is kind of my feeling. I know that a lot of organizations, especially the big incumbents, are kind of anti-regulation, or only want regulation that benefits them.

[18:14] Right. But I think we've gone beyond that. Regulation isn't the only thing that companies need to worry about. They need to worry about what their consumers want. Because I do see people who I maybe never thought cared about privacy switching to different companies, because they say, I don't like the way this company handled my data.

[18:34] Right. So I am seeing people start to move and make decisions with their money, just like the thing you said about cashless. You know, I've seen people walk out of places because they're like, well, I don't want to use my credit card.

[18:49] You know, I don't want to be tracked; if I can't use cash, I don't want to patronize this company. But I just want your thoughts. What is your feeling there?

[18:58] Luke Mulks: Yeah, I think you're starting to see a movement on these things. I think that a lot of it's going to come down to adoption still. I mean, it's hard, right?

[19:09] Like, these companies, these incumbents, with regulation, a lot of them can either fight it out for years in court or drag it out. I mean, that's one of the negative things about these tech companies being so huge: they can spend years litigating over these privacy things for what ultimately ends up being a fine where, by the time the process is over, they've made multiples of what the fine is.

[19:36] They're not worried about it. And so I think there's this idea of regulatory capture where they kind of work too closely in some circumstances or you end up with toothless regulation too, where it's like, okay, yeah, like, it is good to know that this website is going to have 250 companies tracking my information.

[19:57] But what does that do for the user? I think we need to have better tooling, and more of it, or just more adoption of the tooling that's out there.

[20:06] And so I think that some of this is just not going to be solved by regulators. And this is a global ecosystem that we're in on the web.

[20:14] And I think where you start to run into problems with regulation, too, is, one, I mean, we're pretty active in the cryptocurrency area of the spectrum, where you've had a really kind of strong-armed regulatory environment that hasn't been very clear.

[20:31] Like, GDPR is very clear. People might hate on GDPR, but at least GDPR tells you what the thing is. It gives you a framework to work in. If you look at something like cryptocurrency, they're working off of regulations that are like a hundred years old, right?

[20:45] Like, oh, this was a judge's ruling on orange plantations or something like that, with securities law from like 100 years ago. And we're applying that to cryptocurrency in 2024.

[20:58] It's insane. But anyway, it's a much different environment. So I think that regulation is helpful, but you can't over-regulate. And when there's new technology, too, you really have to look at it from a perspective of letting the market get fit before you start to regulate it, because otherwise you're not going to be able to innovate at all; you're going to be held back.

[21:19] Like that's one thing we've seen with the cryptocurrency side. Like, I've seen businesses just go under because they can't afford to have the legal representation. And then the legal representation tells them, hey, you can't do this, this or this.

[21:30] And then all of a sudden you're having to shut out the US, or North America, or Western countries from being able to use this thing, because they're too afraid of getting some enforcement action against them.

[21:41] And one of the difficult things on the regulatory side is that they're doing a selective enforcement thing, where they were going after companies of a certain size that they knew would settle, because they wouldn't want it to get decided by a judge.

[21:54] They would want to make sure that they weren't establishing new precedents. So it was very complicated. I think it really comes down to just building better products that are going to compete.

[22:05] And if you can do that, you have an edge with privacy, right? Like, if you can make something where you're able to cut all the junk out by blocking a lot of the noise on the Internet, you have an advantage there that Google and all these big companies are going to have to try and work backwards to win back users on.

[22:25] And they're not very good at that at all. And one of the things that's been interesting for us is, we have a search engine, right? It's a private search engine.

[22:34] You can think about it like DuckDuckGo, except we actually own the index. We have our own independent search index, right? So it's not using Bing or any of these other incumbents.

[22:45] It's our own search index, and it's private. And we added an AI answer engine to it too, so that you get these AI answers and the sources are cited and all of that.

[22:55] And it's kind of a nice display when you're searching for something.

[22:59] Google and Bing are trying to do this, but they keep having all these issues where, like, they're telling people to put screws on the top of their pizza and like all these weird kind of misfires on the AI side.

[23:10] So even the incumbents are not always ahead. Just because you're in big tech doesn't mean that you're going to put out the best thing. I think the agility that you have as a startup can be a competitive win.

[23:22] And I think that people are wanting better tools. They just don't have them in a lot of circumstances, or they don't know about them, because that's the other thing. I mean, we're at 80 million users.

[23:32] I don't see why we're not at 200 million. But if you want to advertise, you know, and get your brand out there to hit that level, you either have to build it from the users or you have to pay millions of dollars to put your brand out there.

[23:47] Like if you want to accelerate that. So there's challenges that startups have when they're building a new thing where you start to get market fit and then okay, do I have millions of dollars to throw down?

[23:57] Well, not in this environment. Everyone's kind of sharpening pencils and having to do more with less. So that's not necessarily a bad thing either. I think that, as far as we're concerned, just by having a really good search engine with AI answers that work, we're seeing people say, hey, look, Brave is just a good alternative to Google, and they don't even mention privacy.

[24:19] And that was one of the most satisfying things I saw, because we didn't say privacy once; they just liked what we were doing, and what we did was just private by default anyway.

[24:31] Like, that's where you gotta be. You gotta be in that place where it's just usable. It's as easy as Venmo, but has privacy with it, right?

[24:40] And I think that we're getting close, but we're not quite there yet with everything. And I think that it's also kind of tricky to navigate in the privacy space, because you have these different cohorts of people that are into privacy, right?

[24:53] Privacy isn't an absolute; it's not all or nothing. It can be difficult in this space sometimes when certain hardcore people, the people that are like, okay, I'm totally locked down on everything to the point where I'm acting as if a state actor is chasing me down, get really critical of certain privacy products because they're not as private as what they're doing. But they're just not acknowledging the trade-offs.

[25:23] And I think it's a game of inches, where if you make the right choices with privacy protection, you can limit the footprint of somebody's data out there a lot.

[25:36] And I think part of this is there's going to be healthy competition, but part of it shouldn't be so savage when it comes to really cutting down competitors in this space.

[25:46] You know, I think there's a little bit of growing pains there on the privacy tooling side of things that can get better. But I also just think, ultimately, you build a good product, you get it in front of people, and you listen to your users, and you're going to see people adopt that product.

[26:01] Debbie Reynolds: Yeah, right. Yeah, I agree with that.

[26:05] I want to talk a little bit about just ad tech in general and privacy and maybe talk about kind of how we got off the rails. Because I remember the early Internet and I remember how as you were saying it was like you had a website, you had a product, you know, people came to your property, they got to see your stuff as opposed to you targeting the person.

[26:25] And we're seeing a lot of problems in privacy and cyber because of the over collection of data and that monetization and the fact that you sort of lose control of that data.

[26:39] And so I like the fact that you're saying, by default, you're doing things in a more privacy-preserving way, because it really reduces the footprint of that person, I guess the stuff that actually gets out about them.

[26:51] But I want your thoughts on that.

[26:52] Luke Mulks: Yeah, no, I mean when I was working in this ad tech space at first it was very much like you go to a site, maybe there's a couple of trackers on the site.

[27:02] They're usually localized to that site. It might be by Google or whatever, but it wasn't the same way it is now, or even where it started to progress, where all of a sudden even the stuff that's supposed to be helping privacy in ad tech is still run by Google.

[27:19] Right. So you're just saying instead of it hitting five or seven different tags on the client, on the user's webpage, it's going to go through one that routes to all those same ones through Google's cloud service.

[27:31] So Google's still collecting the data; it's just not as broadcast on the user side as it was before. And I think that's part of the problem here: you've got to look and see, can we get past the point of having to collect all this stuff?

[27:48] Because here's the thing: right around the time when programmatic advertising started to get really large was also the time when Snowden started giving us revelations about what the governments were doing, and how the governments were kind of plugging into big tech and subverting it that way.

[28:06] And then you have a mix of governments doing that, where they were, you know, sticking a hose in the pipe and collecting the data that the big tech companies were collecting.

[28:16] And at the same time, you have these policies where they could subpoena records, or get records with a letter and all that, without the user knowing about it.

[28:26] But then what you started to see happening after the Snowden revelations is this whole idea of business intelligence as a business sector started to pop up, and you started seeing data warehouses, and you started to see programmatic advertising all kind of kicking up.

[28:39] And the government just started saying, look, we got busted. They saw us with the sad face drawing, you know, where someone showed how they cleverly got into Google.

[28:52] We can't do that anymore; we're just going to go buy the data. And they started buying the data. And so I think that's something that people don't realize: with this ecosystem the way it is, governments can go and buy this data, or they will set up a company and get the data, you know, as a cutout of a media company or something.

[29:12] All this data that you're leaking can get used in ways that you don't anticipate. So the best way is to kind of limit your footprint.

[29:20] And what we do at Brave is we say, basically, let's say a user goes to, I don't know, NBC.com, right? That domain's owned by NBC.com.

[29:32] The requests you get from that page should be for NBC.com, or a set of, you know, framework requests for loading a video player or doing other things, right? But third-party calls to ad tech that are going to collect your data are not something that the user is aware of or that they should have to agree to.

[29:51] The web shouldn't work that way, right? It should protect the user's privacy by default. So all those third-party calls we're going to block, and if the websites can get it together and can build it through a first-party call, that's fine.

[30:04] We'll do that, we support that. But we're going to take this stand because in a user first environment, the user is going to be the one saying yes or no to things and you shouldn't collect their data by default.
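
[Editor's note: a minimal sketch, in TypeScript, of the first-party versus third-party distinction Luke describes here. This is illustrative only, not Brave's actual implementation: real blockers resolve registrable domains with the Public Suffix List, while this toy version just compares the last two hostname labels (which misfires on suffixes like .co.uk), and the example hostnames are hypothetical.]

// Classify a subrequest against the page the user actually visited.
function registrableDomain(hostname: string): string {
  // Naive: keep the last two labels ("www.nbc.com" -> "nbc.com").
  return hostname.split(".").slice(-2).join(".");
}

function isThirdParty(pageUrl: string, requestUrl: string): boolean {
  const page = registrableDomain(new URL(pageUrl).hostname);
  const request = registrableDomain(new URL(requestUrl).hostname);
  return page !== request;
}

// A first-party framework request (e.g. a video player) matches the site:
isThirdParty("https://www.nbc.com/show", "https://player.nbc.com/player.js"); // false
// A call out to an unrelated ad tech domain is the kind blocked by default:
isThirdParty("https://www.nbc.com/show", "https://pixel.adtech-example.com/track"); // true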

[30:15] It's that simple: you shouldn't collect their data by default, or let third parties do it. And so we're not against users giving data to companies. We're for it, if the user knows what they're doing.

[30:24] Like, we run ad campaigns with Brave where an advertiser, or a brand, can get an email address from somebody. The user's got to give it to them, though.

[30:35] And once they give it to them, they can have that relationship. But you want to make sure that the users are the ones that are driving those relationships.

[30:42] Right? And that's really where this dynamic shift happens. And I think that ad tech has completely built a system that works in the opposite way, where, you know, they've somehow kind of psyoped the world into thinking that in order to have a free web, you have to surrender your privacy, which is a joke.

[31:01] And then they've kind of made this whole dynamic of, okay, if you're not doing anything wrong, then there's nothing to hide, which is just not true.

[31:11] People need to realize you don't have freedom unless you have privacy. If someone's telling you you don't have privacy, then you're not a free person.

[31:23] You have to have privacy in order to create, in order to be able to work without self-censoring, and to be able to create the future you want to create for yourself and with your family and friends and everybody that you want to give your life to.

[31:36] Like, no one wants to give their life away to these companies. And what the companies are giving back is not a reasonable service that's worth you surrendering all of this.

[31:44] You shouldn't surrender it. We have protections that used to be well understood and have kind of fallen by the wayside. And I think that the only way to really combat that is by building tools that respect the user and can compete.

[31:58] And I don't think governments are going to save us. I think, you know, it's going to come down to people that feel the same passion building the tools that will give people the freedom that they've been losing.

[32:10] Debbie Reynolds: You know, basically, I agree with that a hundred percent.

[32:14] Luke Mulks: Right?

[32:15] Debbie Reynolds: Because I have lots of friends who work in regulation; I don't work in regulation myself. Nothing wrong with regulation. But I feel like some people think if we can just get regulation, like Mount Everest, and we get to that peak, then everything else solves itself.

[32:31] And I'm like, well that's only part of the puzzle. It's like a lot more. So we have to think about what that means and those other things that we have to do.

[32:39] So I think that's really important.

[32:41] Luke Mulks: And you can have all the regulations in the world, but if companies aren't going to respect those because the regulators aren't going to enforce it, then okay, great, we passed this regulation but like it's meaningless if they don't do anything about it.

[32:55] And I think that's been a problem is that you get a couple cases here or there where they make a headline with a big fine, but like really nothing changes.

[33:04] And I think, you know, a lot of times too, the remedies aren't really that great. There's been a double-edged sword with this, with GDPR, in that people have really grown to hate cookie consent dialogs as a result, you know. And if you go to Europe and you browse the web, it's just bad.

[33:21] I mean, on one side of it, it's great that you're aware of these things, and that you've got pieces of this where, you know, the right to be forgotten is fantastic.

[33:29] There's a lot of good things that the GDPR brought, but a cookie consent dialog is not the answer to the problem. It really kind of just dances around the fact that we have to re-architect the way these relationships are between businesses and people, and really put more respect toward the user.

[33:47] I mean, we're starting to see things move in that direction. I'm excited about that. I think, you know, there's also this stuff with antitrust right now that's been really interesting, and we've seen some things happen in Europe especially, around these remedies and rulings where these companies are forced to bring more choice to the market.

[34:10] And when that happens, we see big uplifts. I think we saw a 30% increase in some of these countries by presenting the user with a choice, and it really kind of shows you how much of the market's been bought by these companies in one way or another.

[34:24] Debbie Reynolds: Wow. Well, you hit on one thing that I would love to talk about.

[34:28] Luke Mulks: Yeah.

[34:28] Debbie Reynolds: One of my favorite topics that I really get to chat about and that is the relationship between privacy and competition or antitrust.

[34:38] Luke Mulks: Yeah.

[34:39] Debbie Reynolds: I was interviewed for Bloomberg many years ago, and I was doing this article with an antitrust lawyer, and I was telling him about the relationship between privacy and antitrust. He literally didn't believe me.

[34:53] I was like, it is connected. So I want to know your thoughts. You touched on it a bit, but I want you to kind of talk through how it is related.

[35:05] Luke Mulks: Yeah, I mean, it really is systematic. If a company knows all the data, right? And, you know, everybody assumes everybody operates in good faith a little too much.

[35:17] I think, you know, it's one of the big things that really was eye opening to a lot of people is just how loose a lot of these companies are with the data.

[35:25] I used to get feedback from people; people used to say, oh yeah, Google is not going to abuse the login on the browser. That's a separate thing from advertising.

[35:35] And then, oh, here come the court documents that say, nope, they've been doing it this way for years. It's not what you thought. And I'm never going to be the person to say I told you so, but look, the profit motive is motivational, right?

[35:49] Like people are going to move with that.

[35:51] The real problem, though, is that what you start to see with this antitrust stuff is that it's not about there being one monopoly. It's that some of these organizations are multiple monopolies under one umbrella.

[36:05] So, like, if you look at somebody like Google, they have the Android operating system. And a surprising number of people, you learn this working at a browser company, a surprising number of everyday people do not realize the difference between the browser and the search engine on their phone, because the search engine's right there on the home screen. They get this confused all the time, to the point where we ended up getting a search engine, right?

[36:29] Like, it clears that up. But I think, you know, people don't realize how much people see these things as one and the same. But what you started to see is, okay, Google has that mobile operating system, they have the search engine, where, you know, a huge majority of searches are happening on Google, and that's a business where it's very hard to come in as a startup

[36:50] and do something. You have to have a really new way of indexing that no one else has done before in order to catch up, because if you do it the same way that these big companies do, you're never going to compete.

[37:00] They're way too big. That's why you also see that there's Bing, and there's Google, and there's us, because all these other search engines are using Bing or Google in one way or another to syndicate the results, which is fine.

[37:14] They can do that. And a lot of them are using us now too, because we've started to make that available. But okay, so Google has search, they have the operating system, they have the browser, right?

[37:25] Like, the majority of people are using Chrome, all of those things. And then the ad tech ecosystem, they own so much of that, or they own the arena that it's in.

[37:34] It cannot be objective when you own that much of the real estate everywhere, right? You're going to see what we're seeing happening now, where the browser can't really operate with the ad tech ecosystem the way it was before.

[37:47] But then you see other weird deals. Like, what Google did with Mozilla, with Firefox, is they made a default search deal where, you know, I think over 80% of

[38:01] Mozilla's revenue was from that Firefox Google search deal that they had. So they default to Google Search, and Mozilla Firefox makes revenue, basically, when people click on the ads through Google search results.

[38:13] And so it's kind of like we're not going to own everything, but we're going to own it in one way or another. Right? And they did the same thing with Apple too, with Safari.

[38:23] And when they can pay those huge amounts of money for those distribution deals, they're basically buying out any room for competitors to compete in a meaningful way. And so what you start to see happening is that they have total ownership over the real estate, and then over the influence in that real estate, because they have the data; they know what the users are going to do.

[38:46] They own the house, they own the foundation for the house, they own the pipes, they own the flow of water through the pipes. It starts to look like they own everything.

[38:55] And I think what you're starting to see, or what we're seeing at least, is when you open up the exposure and show people that there are other options, people start to gravitate to those options.

[39:07] And I think that's what's been scary to these incumbents is that they haven't had to deal with this before in the same way. Like, if we can go in and say, look, you can have the equivalent of a YouTube Premium subscription for free just by using our browser, people move toward that because people are getting hosed.

[39:27] Like, okay, how often are these things going up in price while your data is getting, you know, fire-hosed out to everybody? Right? Using YouTube is just really difficult these days, because it's just so many ads; they're putting their finger on the scale too much.

[39:42] And the longer they do that, the more people are going to be reluctant, or ready to move to something else. And I think that's where we're at now.

[39:50] Like, we're at that place now where people want to move. And they're so starved for other options, and people are just so blinded to how bad things are, that, like, I used to tell people this.

[40:02] I would say, well, go try out Brave for a week, and then go back to Chrome for another week, and see what you're missing. You don't realize how bad it's gotten, because you're just so accustomed to how bad it is.

[40:13] Like, why do these sites take so long to load? It should be faster than this. The technology is better. It's just that these systems are allowed to kind of keep doing what they're doing.

[40:21] There's no incentive for them to stop. And so if they're not going to build room for incentives to come in, then competition is just going to come in and clean house.

[40:30] And that's what we're trying to do, you know, with what we're doing: just to show the market that there are better options that you can use.

[40:38] Debbie Reynolds: Absolutely. That's amazing. Thank you for sharing that.

[40:42] I want you to clear something up. You know, I've done a video about this, I've talked about this several times, and it's about Global Privacy Control. So we see some states in the US that have passed regulations saying that companies should be using Global Privacy Control.

[40:59] And I guess the frustration is, people don't understand it. First of all, Global Privacy Control isn't global, right? So when a normal person hears that, that's what they think it means, but it doesn't.

[41:08] So would you tell me what Global Privacy Control is and what it isn't?

[41:14] Luke Mulks: Yeah, it's kind of like some of these other efforts. There was a Do Not Track effort, where users would say, okay, if you opt into this, then we're going to send a flag that says, okay, do not track this user.

[41:25] And you're kind of hoping that companies respect that. Global Privacy Control is a similar thing. It's a newer version, where basically a bunch of people are going to send a Global Privacy Control flag with requests.

[41:37] And if companies are observing it, then they will respect the global privacy control and not track the user in the same way.
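
[Editor's note: mechanically, a browser with GPC enabled attaches a "Sec-GPC: 1" header to its requests and exposes navigator.globalPrivacyControl to page scripts; whether anyone honors it is up to the receiving site, which is the gap Luke describes next. Below is a minimal sketch of the honoring side in TypeScript, assuming an Express-style Node server; the disableTrackers flag is a made-up convention for illustration.]

import express from "express";

const app = express();

app.use((req, res, next) => {
  // Browsers with Global Privacy Control enabled send "Sec-GPC: 1".
  if (req.headers["sec-gpc"] === "1") {
    // Treat the request as an opt-out of sale/sharing: downstream
    // handlers should skip loading third-party trackers in response.
    res.locals.disableTrackers = true;
  }
  next();
});

// Page scripts can also read the signal via navigator.globalPrivacyControl,
// but nothing in the browser enforces that the site actually complies.

app.listen(3000);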

[41:43] The problem with these things is that you're still relying on actors that are monetizing the data to behave well. And sometimes it's not even that they're being malicious; it's just that they don't know.

[41:56] I think that's part of it, coming from the ad tech space. It's not that people are knowingly abusive of users' privacy; it's just that they don't know what this stuff does, even if they're working in it, right?

[42:12] If you're a publisher and you're not forced to learn something, you're probably not going to respect the thing, even if it's there. It's unfortunate, but you're looking at organizations that have had to reduce staff, that have had to do more with less.

[42:30] They don't necessarily have time to learn all of these things. Some will, and that's great, but it's not getting to the root of the problem in the same way. And I think, you know, these efforts are nice-to-haves, but they're not going to solve the problem, right?

[42:46] Building a better tool is what's going to solve the problem. These other things are great when people respect them, but just like you said, they're not global.

[42:56] It's fancy language, right? It sounds bigger than it is. And, you know, if we're having to go state by state, the Internet doesn't work that way, right?

[43:05] There's ways you can restrict stuff, but again, it's not as meaningful as it could be if you just had something that was built to where it doesn't have to happen.

[43:17] Why do I have to inform somebody? Just don't steal my stuff. It's that simple, right? But again, there haven't been people building privacy-preserving ways of doing things, and that's been the problem.

[43:31] You've got to have somebody that's going to be honest about what privacy means, around those first principles, and you're going to have a really hard time doing that with people that have said this is not a problem, as a first principle, forever.

[43:46] And they've made money hand over fist off of that. It's a much harder thing for them to do. That's why, I mean, it's a harder job for us.

[43:55] I mean, I can't tell you how much money we've turned away by not taking shortcuts with this stuff because you know, companies will say, hey, just whitelist this thing for me or you know, allow this exception here and we won't.

[44:07] And that costs us a lot of money. But we've got to build it the right way for people, and we'll take part in some of these initiatives.

[44:15] We've taken part. We send a Global Privacy Control flag out. But we also go farther than that; we just build better protection in by default, on top of that.

[44:25] Because you can't really rely on people behaving their best in this world. Most of the time they won't, you know, unfortunately, if the incentives aren't aligned in the right direction.

[44:37] They're going to go do press events and stuff, but they're not going to, you know, the rubber is not going to meet the road the way you hope it would, unfortunately.

[44:43] Debbie Reynolds: Yeah, right. I agree with you wholeheartedly. I think this is something that has to happen on an architectural, foundational level. And this reminds me, maybe a tad off topic, I'm not sure, but this is so funny.

[44:59] It reminds me of robots.txt.

[45:03] So when I read some of these big companies' privacy policies and they say they respect robots.txt, I literally spit out my tea. It's like, what? Oh my God.

[45:14] I call it the fig leaf of the Internet, right? At the dawn of the Internet, we were trying to be more, I guess, you know, courteous, and doing these pinky-swear things.

[45:25] Right? But I have to tell people, robots.txt doesn't stop anyone from taking your data. It just tells them that you don't want them to do it.

[45:34] Right?
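
[Editor's note: for context, robots.txt is a plain text file at a site's root that asks crawlers to stay away; nothing enforces it, which is exactly the pinky swear Debbie describes. A typical file looks like this sketch; the crawler token shown is one real-world example, and the paths are hypothetical.]

# https://example.com/robots.txt -- advisory only; polite bots
# choose to honor it, and nothing stops the rest.

User-agent: GPTBot
Disallow: /

User-agent: *
Disallow: /private/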

[45:35] Luke Mulks: So I'm going to steal that, the pinky swears. I think that's a really good way of putting it, because that's basically what it is, right? You're like, okay, I'm going to tell the web that this is not to be used.

[45:45] Or the noindex tag, right, that people would put on all their stuff. And I don't think that's off topic either, by the way, because it's becoming a bigger thing now with AI. Some of the same things that were happening with privacy are now happening with content and AI, and the publishers are really at risk with that one. Because, you know, you're starting to see some of these valuations, like for Perplexity and some of these guys that are out there at crazy, insane valuations, and you wonder how much of the fundraising they're doing is going to be around some of the litigation, because they're going to have to make deals with publishers. Citing sources and things like that is really important.

[46:26] And publishers, that's their living. It's not even about advertising at that point; it's whether their content can even be there. And on the flip side of that, you're just going to get publishers creating content with AI too, and it's going to be something where the smell test is going to get there, where people recognize it. Even now, I think you'll see people say, oh, that's just a ChatGPT answer, where people were really afraid of, oh, everything's going to be bots.

[46:52] And humans are really savvy with this stuff. They get a sense of when something's real or not after it's been out there for long enough.

[47:00] But yeah, the pinky-swear thing, it's real. And unfortunately, they've broken their promise so many times that you can't rely on that as your safeguard anymore.

[47:15] And well-intentioned beginnings are well-intentioned beginnings, but you have to have, you know, some better measure in place to make sure of that. And we do a good job of breaking the third-party stuff, but there are people that want even stronger privacy protection than what we give, and users should be able to turn that on too.

[47:39] I think, you know, it's one of those things where there's different comfort levels that people have, and you really gotta try to take the biggest win possible for a reasonable expectation.

[47:50] Right? Like I go to Google, even on Google, right, I go to google.com and make a search. I don't care if I see a Google search ad because it's on google.com

[47:59] I know that I'm getting an ad from them there. I care if I go to all these sites where other third parties are serving me stuff based off of my information; if it's something I'm putting in there myself, that's different. But other people don't want that.

[48:11] They don't want to see any ads at all. And okay, cool, that's fine. Can you make the advertising experience respect the user, something that users are fine with or even find helpful? That's one of the things we've tried to do at Brave: make an ad ecosystem that respects users, that's not only private, but also isn't the same ad system that they're used to,

[48:38] the one that's so annoying that they want to block it. If we have a browser that blocks ads, and then we can get users to opt into ads, or give users the option to turn certain ones off if they want,

[48:49] you're giving the users choice. That's user first. You're making an ecosystem where brands can do stuff tastefully, in a way that is helpful for users. And ultimately, advertising isn't a bad thing.

[49:04] From our point of view at least, a lot of people learn about things from advertising, and there's always been this kind of stylish thing about it, where you want to know what the brands are doing and become aware of things that are happening.

[49:17] It's just gotten so abused that it's not even really advertising anymore. It's something else, some whole other weird surveillance thing now. So a lot of it is just: respect the user.

[49:30] Really. And if you can do that, then it's a new world with these things. We can have a tool that respects the user.

[49:41] Users are now empowered to say, no, I'm not gonna play in this old system anymore. And I think businesses are starting to get wise to that too.

[49:50] And AI has been a thing where, look, it's not even about user privacy so much as: do you want all your people using AI and putting your company's sensitive information into this thing, where you don't know where it's going to go?

[50:03] A lot of it is that level of operational security. It's the same thing as telling people not to click on phishing links, right?

[50:12] You want to make sure you're telling your staff not to put sensitive information into some of these LLMs, right? Some of these chatbots where the data goes who knows where.

[50:22] Or if we're doing what we're doing, where we have an AI assistant in the browser, you're making sure that you're protecting the user's privacy on everything they're inputting, and that we're not collecting any of the data ourselves either.

[50:32] So there are ways that those first principles, again, can be applied to AI in the same way you apply them to finance or just browsing.

[50:41] Debbie Reynolds: Excellent. Excellent. I didn't even have to ask you about AI. And it came up.

[50:46] Luke Mulks: It always just kind of comes up.

[50:49] Debbie Reynolds: Well, if it were the world according to you, Luke, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior or technology?

[51:00] Luke Mulks: Oh, gosh. Well, selfishly, I just want everyone to use Brave. I think that's a good wish.

[51:06] Debbie Reynolds: That's a good wish.

[51:06] Luke Mulks: That's a good wish. But really, I would shake up how much influence companies have over regulation, over lobbying.

[51:22] And this probably deviates off topic, but whatever, I'll go there anyway. Whether you're talking about election reform or privacy or AI or whatever, money is a corrupting factor with these things.

[51:35] If you can put better rules around the money, you don't have to overhaul a system; you can upgrade it by adjusting it. For example, say you're running for Congress in a district, right?

[51:49] Well, the person you're running against could get money from donors in all the other states, and in some cases even from outside the country. How is that working for the voters in that district?

[52:01] It's working against them. You don't need to overhaul the whole system. You just need to say, okay, if you're going to run in that district, you're going to raise money from that district and not accept money from outside it.

[52:11] Right? Then you're motivated to govern for the people you're representing versus outside influences. And with privacy, it's the same rule. We have this revolving door between industry and government, and it applies to everything from defense to privacy,

[52:35] or even antitrust, right? You've got company executives who go become the regulator and then go back to the companies. You're not going to see real progress there if you can't get the money situation under control and really regulate how the financial incentives influence the policies that govern the technology.

[52:59] So a lot of this needs to focus on fixing the money flows and aligning the incentives with people, whether they're users or citizens or whatever.

[53:10] That's where a lot of the corruption happens: with the financial incentives. So if I had one wish, it would be to shake that up, to shake up the financial incentives around the folks influencing policy and regulation.

[53:26] Debbie Reynolds: That's certainly a good wish. That's certainly a good wish.

[53:31] Actually, I saw an article recently, an op-ed in a Connecticut newspaper, and it was supposedly, and I'm doing air quotes, "supposedly" written by a Connecticut small business owner.

[53:43] And they were giving all the reasons why they were against privacy regulation. And you could tell it was a veiled letter from a data broker, because some of the stuff they were talking about was like, oh, well, in these couple of states they're doing this.

[53:57] I'm like, typical small business owners probably can't tell you how many states have privacy laws, you know what I'm saying?

[54:04] Luke Mulks: Right.

[54:05] Debbie Reynolds: So you could tell that it was a lobbyist, wink wink, that had written this thing. That's problematic, because you're basically trying to influence someone against something that's really in their best interest.

Right, right.

[54:18] Luke Mulks: Yep, totally. These games happen all over the place. They happen on ballot measures, they happen in the press. And at one point, I think it was the president of the Interactive Advertising Bureau who was trying to say that having an ad blocker is the same as robbing a store or something like that, trying to equate it with criminal behavior.

[54:40] And it's like, no, it's not. There's precedent saying users can control their software. But these things get really wacky and skewed. And yeah, I have a hard time believing that came from small business owners. Small business owners are focused on their survival, right?

[54:58] If you look at the odds, what, something like nine out of ten startups fail? So they're much more concerned with making a viable product or service for the market than with every nuance of regulation, right?

[55:10] You have to make the regulation work with the technology at the fundamental level. It's the same thing with me: I'm a parent, I have two kids.

[55:22] There's a busy schedule in the day; I only have so much bandwidth to process things. That's why you have to really build this in at the tooling level.

[55:32] Everyone hopes they're not getting taken advantage of. But what you want to do is deliver products that eliminate the need to hope for it: build the products that you want to see in the world.

[55:43] And I think people have done that. I also think it was harder to do before than it is now. There's so much great open source software now, and so many ways, with social media and other things, to get the word out about what you're doing, that it's never been as turnkey to do things as it is now.

[56:03] In the past it was a lot harder. So I think we're at an interesting point where people almost have power tools now, where they can go and choose their own destiny, and if they want something bad enough, they can work at it.

[56:15] And that's awesome. But you've got to have people building things that don't let the tools take advantage of the people using them, too.

[56:23] Debbie Reynolds: Yeah, I agree with that wholeheartedly. Well, thank you so much, Luke, for being on the show. I really appreciate it. This is amazing.

[56:31] Luke Mulks: Yeah, yeah, thanks for having me on.

[56:36] If folks want to learn more about this stuff, we have the Brave Technologist podcast that I host. If you go to brave.com/podcast, you can see the episodes there, or just check out our stuff on brave.com. And I'm just Luke Mulks on Twitter, if anybody wants to jump on there and say hello; my DMs are open.

[56:52] Debbie Reynolds: Excellent. Thank you so much. I really appreciate it and I look forward to us being able to collaborate in the future.

[56:57] Luke Mulks: Yeah, yeah, thanks, Debbie. Really appreciate it.

[57:00] Debbie Reynolds: Excellent. Talk to you soon.

[57:02] Luke Mulks: All right.