"The Data Diva" Talks Privacy Podcast

The Data Diva E113 - David Heinemeier Hansson and Debbie Reynolds

January 03, 2023 Season 3 Episode 113
"The Data Diva" Talks Privacy Podcast
The Data Diva E113 - David Heinemeier Hansson and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds “The Data Diva” talks to David Heinemeier Hansson, Co-Owner and Chief Technology Officer of 37signals (makers of Basecamp + HEY). We discuss the EU's proposed data transfer framework, which is still in draft form. We talk about how the framework will impact businesses and how privacy is a data issue, not a legal issue. David Heinemeier Hansson is a Danish entrepreneur who has been working with the internet since the mid-90s. He is the co-owner of an American software company, and he is currently in Denmark. In 2013, Edward Snowden revealed the NSA's tapping into global internet cables and its wholesale data collection. Heinemeier Hansson was horrified by this and wondered what the European reaction would be. The EU has traditionally taken a stronger stance on online privacy than the US, and Heinemeier Hansson wanted to see whether that would continue in light of this new information. In the wake of the European Court of Justice's ruling that the Privacy Shield agreement between America and Europe was invalid, privacy activists have been scrambling to figure out how to reform the agreement.


We discuss his journey into technology and privacy; his strong stance on the challenges businesses face with transatlantic data transfers; how businesses were left scrambling by the changes in privacy regulations between Safe Harbor and Privacy Shield; legal regulations and technology; the responsibility of regulators and businesses; privacy via encryption; small companies and data risk; reconciling US and European privacy issues in the short term; attempts to provide clarity to businesses about data transfers; the expense associated with compliance; and his hope for Data Privacy in the future.




Debbie Reynolds
During this show, we talk extensively about EU-US data transfers. On December 13th, 2022, after this podcast was recorded, the EU released the actual draft decision, which will go to the EU court. Once approved, it will form the basis of a final framework that will go into effect in 2023. At the time we recorded this podcast, there was no text; now we have text that people can mull over until this framework is finally adopted by the US and the EU. Enjoy the show.

36:49

SUMMARY KEYWORDS

privacy, data, companies, people, privacy shield, europe, court, services, trade, eu, principles, deal, european, encryption, conclusions, intelligence services, email, spy, american, thought

SPEAKERS

Debbie Reynolds, David Heinemeier Hansson


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. Our special guest on the show is David Heinemeier Hansson. He is the Co-Owner and CTO of 37signals, which is the maker of Basecamp and HEY. Nice to meet you.


David Heinemeier Hansson  00:42

Thanks for having me on the show.


Debbie Reynolds  00:43

Yeah. I mean, I'm very excited to have you on the show. I received a news feed article that you had written about Europe and data transfers, and it sort of came through my phone like a lightning bolt, okay? Because I have conversations with people all over the world around data and stuff like that, and the things people say in private and the things they say in public are totally different. And I feel like you really just kind of opened the curtain on that, in terms of, you know, not only someone who is in Europe but also a business person that is impacted by all these different regulations or machinations and what people think privacy should be and how to solve this problem. You know, the thing I've always said is that privacy is a data issue, not a legal issue. So trying to shoehorn it into a legal thing doesn't really solve the problem. But before we get started, I would love for you to tell me about your journey in technology and why privacy has become an issue, or why it gets such a, I guess, visceral reaction from you.


David Heinemeier Hansson  02:06

Yes. So I've been working with the Internet and on the Internet since the early to mid-90s. So the functions of the Internet, the privacy on the Internet, all the factors of the Internet are something I care a lot about. I also happen to be a Co-Owner of a technology company that makes SaaS software that is downstream from all of these regulations and questions about privacy, and deliberations about how we should do it. Now, on top of that, I happen to be a Dane who's currently in Denmark, co-owning an American company. So I kind of feel like I have a foot on each continent, and therefore also an opinion about how these continents collaborate or not.

And when this whole thing got going, which, to wind the clock back, was really 2013, Edward Snowden reveals the NSA collaboration, the tapping into the cables and the wholesale collection of data from anyone around the world. That was something that was, I mean, first of all, fascinating. Second of all, perhaps horrifying to me as a person on the Internet, to know that this was essentially what was going on. And I remember thinking at the time, I wonder what the Europeans think about this; I wonder what they're going to do about this, because here is a sovereign bloc that can make its own rules about how it wants to cooperate with partners and other countries when it comes to these issues. And the EU had long taken a much harder stance on privacy matters online than the US had, and now here was a case that was going to put that to the test. Well, it just so happened that that test would take years and years and years to roll out and would work its way through the courts, the European Court of Justice. Max Schrems was the privacy activist who brought these cases before the court. It just rolled on and rolled on and rolled on. And then suddenly we had this news that the European Court of Justice says, you know what, this Privacy Shield, this agreement that America and Europe have for exchanging data, is actually void. It is not based on something that's valid; you have to figure something new out. And I remember thinking at the time, wow, that is such an earthquake. Like, what does this actually mean? How can we even reform this? Are the Europeans ever going to tell the Americans what the NSA can and can't do? Can the Americans ever listen to something like that? That sounds interesting. It sounds like a real battle.

Now, when I first learned about that, I was actually surprised by how little happened. You have this tectonic earthquake of a ruling that basically, from one day to the next, says the way you've been operating all along is illegal. But then, in perhaps typical European fashion, the conclusions are just left unended, not wrapped up. It's like you're watching your favorite movie, and you're like, you're going to explain everything, right? And then the final scene just comes before any of the conclusions: what happened to that guy? What happened to that woman? What happened to the plot? Right? And the European Court of Justice is like, well, I don't know. We just said it was illegal. It's your problem. And we've been wrestling with that problem ever since.


Debbie Reynolds  05:34

So this is interesting. I love to talk to people who have been dealing with privacy issues since before GDPR, like you and me. So when that happened, I was also just shocked. I couldn't believe that agreement got knocked down, right? But you're right, they sort of left businesses holding the bag, like, well, I don't know what you're going to do within, you know, the month. And I couldn't believe that it went on, I think, three or four months before Privacy Shield came about. So literally, businesses were doing business one day and just had to stop what they were doing, try to figure something out, and sort of scramble and figure out what they wanted to do. I think one other thing that happened as a result of that, and I think we're still seeing the effects of this, is that at the time Safe Harbor was invalidated, I think they had almost 11,000 companies signed up. But once Privacy Shield came on board a couple of months later, I think only about half of those signed up again, right? Because they're like, this was just bananas. You know, we still have to do business, we still have to do trade, but then we just don't have any guidance really about what needs to come next. So tell me, what did companies do during that little period, the few months, I think it was three months or something, where there was no agreement or anything?


David Heinemeier Hansson  07:00

Well, I don't think they did anything. Well, the vast majority of them didn't do anything. I'm sure there were absolutely countless meetings held. I'm sure there were a lot of lawyers running up some monster bills to the largest corporations, but everyone else, the vast, vast majority, just didn't do anything. Because what were they going to do? Right? There was no guidance, as you say; there was no clear "what does this mean"; no one in power or authority was willing to draw conclusions from it. And really, to be frank, that wasn't just those few months between Safe Harbor and Privacy Shield. It was all the time afterward, because even during Privacy Shield it wasn't entirely clear what this meant; it wasn't entirely clear how it related to the verdicts that had come down. And everyone was just, I think, looking around, like, what are you doing? What are you doing? Can we just do whatever you're doing? I think there was just such a level of confusion, yet in a way a status quo confusion, right? Like, if we're all confused but we're trying, presumably nothing will happen, which, to the best of my knowledge, nothing really did happen during that interim. Even though there were obviously not just the thousands of companies who didn't make the transition from Safe Harbor to Privacy Shield, but everyone else who technically wasn't in compliance, like, what then? Nothing, absolutely nothing, right? When the legislation is so vague and the enforcement is so absent, of course people are just going to go, I don't even know, let's just wait and see. Except, of course, for the few companies that Max Schrems and others took to court under these protocols and basically said, hey, that's not in compliance, we want to get this invalidated. And I think the main one was with Facebook, Schrems taking that case to court. And I think everyone was right, in essence, to do nothing. That's the thing. That's the strategy of this: when there is no clarity on what this means or how it is going to be enforced, how can we plan anything? How can businesses take account of it? Are we going to let individual businesses of all sizes interpret the tea leaves by themselves? How does that make sense? How is that a good use of anyone's time? It just seems like it was such a dark hole for, again, bullshit work, in essence, right? A bunch of stuff going back and forth. Standard contractual clauses that meant nothing, that weren't enforceable and that were never going to be enforced. Just this busy work where we all pretend, hey, we have to be in compliance, because technically the penalties are quite severe or can be severe, even though no one has, to the best of my knowledge, actually suffered any of those penalties ever, even the likes of Facebook and so on, on these two specific provisions. So I just looked into it, right? I run a software company; we have these two SaaS products we sell in Europe. I am very sympathetic to the underlying mechanics here. I thought the Snowden revelations were horrifying. I thought the EU was right to say, hey, we're not going to stand for this. We're not going to have our citizens be spied upon in this way. We're going to act upon it. So I'm always sympathetic to what's going on here. And then I still go, yeah, but what do you want me to do? What do you want us to do? Be clear, say it one way or the other: either there's going to be an iron curtain of privacy that comes down and, okay, trade in this way will stop.
That's a very draconian outcome that I think very few even European politicians have any appetite for. Why are we going to provoke a trade war here? What's going to go on? But at the same time, they're obviously beholden to what the law of the land says, the law of the union says. And when the European Court of Justice comes down and says, hey, you can't do this, this is not in compliance with our setup here, you have to do something. But no one had the courage, and no one had the stomach, to tackle the issue head-on. So we just spent years and years in this middling, muddling phase until, again, the court comes in and says, no, you're still running blind, right?


Debbie Reynolds  11:20

So two things I want your thoughts on; I'll give my thoughts. There are two striking things about all these Schrems cases and what's happening as a result of them. One, for me, is that although it's definitely an interesting bit of legal maneuvering and legal thinking, it's very untethered from the reality of technology, of how actual data works. And then the other thing is, you know, you as a company, you can't stop, you know, the FBI from busting into your organization, right? So it sort of puts the onus of trying to prevent that on a business owner. And to me, I feel like that should be something handled between the governments, not by a business person. What are your thoughts?


David Heinemeier Hansson  12:16

I completely agree. First of all, this confusion around the data is one of the things that actually prompted me to write this in the first place. We would get outreach from European companies saying, are your servers in Europe where you're storing your data? Can you store your data in Europe? And I would try to explain over and over again: listen, it doesn't matter where the servers are, whether they're in Virginia or in Berlin; the legal authorities in the US will have access to that data, especially for applications accessible over the Internet. It doesn't matter where it physically is. In fact, I couldn't tell you exactly where the data is within the US. For the services we use, we use AWS's S3 service, and it has this region setup where you have redundancy. So the physical, pinpoint accuracy of where that data is, I couldn't even tell you. What I can tell you is that if US authorities show up with the legal basis for getting any piece of data from us, as an American company, we are obliged to hand it over. There is not another way. So if you're dealing with an American company, they are in the jurisdiction of American courts and American law, and as such will turn over any amount of data regardless of where it is stored. So it just frustrated me that there was this shallow understanding that if the company would just have the servers in Europe, all our problems would be solved, because then there wouldn't be this data being shot back and forth. And it's like, it doesn't matter. We looked into this exactly when all this came out; we had our legal team look into whether we could put the data in Europe. And they're like, yeah, you can, but it won't matter. And then we were like, can we wrap the data that is in Europe in a subsidiary that is a European company? Yet that won't matter either, because it's still a wholly owned subsidiary; the US government can still order the holding company to get that data. So all the protections we looked into were essentially a dead end; you cannot protect the data under the guidance of what the European Court of Justice is talking about. That data is liable to American intelligence services. That's a problem. You can't protect against that as an American company. But that was what really rankled me: these European companies, obviously, were not interested in drawing the obvious conclusion. They were not interested in saying, we can't buy any services or products from American companies that store that data, wherever it might be stored. We can't buy services or software from American companies? That would just be a catastrophic conclusion. What, they're going to give up their Google Docs, they're going to give up their Microsoft email, they're going to give up all these other things that they have, all their Apple cloud services? So it was much easier for them to simply put their head in the sand like an ostrich and just say, I won't deal with the problem. I will just choose to interpret the problem as one of "is your data physically in the EU," because plenty of these large companies will make proclamations to that effect. No, no, the Microsoft Outlook email you're storing is actually in Dublin or whatever. And then they go, okay, that's enough, I don't want to know anything more, and they stick their hands in their ears and go la, la, la, la, la, la, can't hear anything else. Now we're good; this is not something I have to worry about again.

So that part, that confusion, the idea that where the data is physically located means anything in any practical or even philosophical sense, was just nonsense. And I wanted to shoot that down, not for our particular purpose, but for all the European companies that were wasting their time requesting this information or these guarantees from their American vendors.


Debbie Reynolds  16:09

Yeah, yeah. Also, I want your thoughts on encryption. This train of thought just grieved me greatly when I heard people say it: they were like, okay, the way we're going to solve this issue is that we're going to do encryption, and then we're going to keep the keys in Europe, and then throw the data to the US. And I'm like, that's not a solution. Well, what are your thoughts?


David Heinemeier Hansson  16:34

It's not a solution for a ton of things. There is a limited number of cases where, technically, it is the solution. If you can encrypt the data and you're doing the encryption and you're holding the key, and you're the only one holding the key, then yes, you can technically put that data wherever you want, as long as that data is sort of all-inclusive, as long as there's not metadata that actually reveals connections between the data, which is often the problem. But that doesn't help you use any services like Google Docs. It doesn't help you use Microsoft Outlook; it doesn't help you use Apple's iCloud services, because none of those services operate in a way that end-to-end, 100% guarantees you that as long as you hold the keys, the data cannot be accessed by anyone else. That's just not how any of these things work, because it's very difficult to make things work like that. There is a tiny handful of platforms, including things like Signal, that truly do operate in that way. It takes an incredible amount of diligence to get there, and a very hardline way of staying there. You can't, for example, offer search. If you're encrypting all the data that you have and you're accessing it through a web application or something else, you can't allow that data to be searched, because the application has to know what's in the data to index it such that you can search it. Very few people are willing to give up things like search. So in theory, yes, if you held the keys, and you were the only one holding the keys, and the data was in its own little box, not connected to anything and not being used in connection with any services, you could do it. But no one works like that. No one operates like that. And ergo, it's another smokescreen for the fact that you can't do this protection; it just doesn't work. In fact, even a lot of forms of encryption, like encryption over the Internet, don't help: the revelations in 2013 showed that, yes, if they just tap in at a deeper level, say with a computer in the AT&T data center directly before things are encrypted or after they are decrypted, it also doesn't matter. So these are sticky, difficult, hard questions. But that's why we have governments; that's why we have courts. You're supposed to adjudicate these issues; you're not supposed to just kick the ball to every individual company to do this sort of root cause analysis of whether this is actually secure enough or not. And it just frustrates me to no end that if you want to have these principles, which, by the way, I have the utmost respect for, I don't know what the economic consequences would be if the EU went hardline on this, they would probably be quite severe, but if you have these principles, you also have to live by them. Otherwise, you've got to throw out the principles; you've just got to come clean and say, okay, this is just too hard, it would be too disruptive. We don't have the power to force American intelligence services to abide by our standards, so we're going to give it up. Yep. This is just a fact of business. If you want to do business with American services, you have to accept the risk that the American intelligence services might tap into that. So be it.
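To make the narrow case he describes concrete, here is a minimal sketch of client-side encryption where the customer generates and holds the key and the provider only ever receives ciphertext. It uses Python's third-party cryptography package (the Fernet class); the variable names and the "upload" stand-in are illustrative assumptions, not anything from 37signals or any real vendor, and the point is simply that a provider holding only ciphertext cannot read, index, or search it.

```python
# Minimal sketch: customer-held key, provider stores only ciphertext.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

# The key is generated and kept by the data owner (e.g. in Europe);
# it is never sent to the storage provider.
key = Fernet.generate_key()
cipher = Fernet(key)

document = b"Customer record: Jane Doe, jane@example.eu"
ciphertext = cipher.encrypt(document)

# Only the ciphertext goes to the remote provider (stand-in for an upload call).
# The provider cannot decrypt it, so it also cannot index or search it,
# which is exactly the trade-off mentioned above.
stored_remotely = ciphertext

# Back on the key holder's side, decryption works because they hold the key.
assert cipher.decrypt(stored_remotely) == document
```

This only holds if the key truly never leaves the customer, and it is why hosted services that need to process your data (search, previews, spam filtering) generally cannot operate this way.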


Debbie Reynolds  19:38

Yeah, yeah. Right. Right. And I think especially for smaller players, I'm sure, smaller companies that aren't the Googles and Facebooks or whatever. You know, I know people hate when you say this, but what are even the chances that you'd ever have anything that anyone would be interested in? I just don't know.


David Heinemeier Hansson  19:58

I mean, it all depends on which services, or who your customers are, who your users are, right? There's a reason everyone from lawyers to human rights folks and so on uses Signal these days, because it does offer at least some of the best guarantees you can get. And it's always hard to know. But yeah, I think you still have to accept that small companies, and not even just small companies but medium-sized companies, actually, you have to be an enormous company to really have the wherewithal to even take on a project like figuring out how this is actually going to work, what we are actually going to change. So my advice would simply be, for now, ignore it, which is just a tragic kind of conclusion to a topic I care deeply about; I would actually really like it if we had some principles that we stood hard on. But for any individual company, should they go out and hire a lawyer to do a root cause analysis of the latest, I forget the abbreviation, there are so many these days, the latest announcement that came out of the White House, which is frankly just a political declaration that doesn't even have anything beneath it? There's not actually a text, right? In court, there's nothing of substance actually beneath it. Should they go out, hire lawyers to interpret that and figure out what to do? No, they should not. They should just ignore it.


Debbie Reynolds  21:24

So yeah, so when this came out, I think it was announced between the EU and the US earlier in 2022. Then before the election, they say, hey, okay, we're closer. So now we have kind of a framework. And it's so funny, because I saw an article yesterday where it said, oh, now companies have like a clear path in what they're going to do between Europe and the US, and I'm like, there is nothing there. There's not a word on a sheet of paper to give anyone any instruction about what they need to do differently or whatever. And I'm like, what are you guys talking about? This is just bananas.


David Heinemeier Hansson  22:03

But what's so fascinating about analyses like that is this is what everyone wants to believe to be true, desperately wants to believe to be true: that there now is clarity, that there is a framework, that people have figured this out, that we're in the clear. Because the consequences of realizing reality, as in, it's a total mess, no one knows anything, nothing has been decided, and none of the core principles have actually been put into action, it's not only depressing but expensive, disruptive, and all these other things, the things that businesses hate, right? Full of uncertainty, potential mega penalties, potentially mega bills from lawyers, all the things that businesses hate. Is now a good time for us to inject that level of uncertainty into everything, on top of what else is going on in the world? No, no, it's not. But there's a lack of courage on both sides to simply say what's what: for the Americans to say, hey, you know what, our intelligence services are going to do whatever the hell they want, because we're a sovereign country, and that's just how it is, deal with it. And for the Europeans to either go, yeah, okay, then we're just not going to allow our companies to use your services, or, okay, fine, you got us on that one. Just have the courage to reach the obvious conclusions, but pick a path. No one wants to pick a path; everyone wants to pretend that things are being solved. Even though we don't solve anything, all we're doing is creating busy work full of intentions that are going to be tested by the European Court of Justice when Max Schrems inevitably files his appeal on this stuff. And they will probably reach the same conclusion again: yeah, you didn't solve the underlying discrepancy between how we view data and the rights to it.


Debbie Reynolds  23:48

Right, exactly. You know, my feeling is that if we do ever have an agreement, it just has to be more narrow, right? It can't be this huge thing, this all-encompassing text. It's like, look, you know, we're going to come up with some narrow path where we can say we try to protect people's rights. Also, one thing I want to mention, and I want your thoughts on this as well. It's so funny: when I talk to a lot of people in Europe, they're like, well, with the Privacy Shield, or whatever the hell we're going to call it now, the transatlantic agreement, we want to have the same rights that Americans have. And I say, Americans don't have those rights. We can't go to court when, you know, the NSA takes our data. There is no redress for us; you don't have standing. So it's like, you want what we have, and we don't have that right. So that's why they're trying to create these little courts within the Justice Department or within the FTC, to try to make it seem like, oh, you have a court process or whatever. It's like, we don't have that, right? You know, privacy isn't a fundamental human right in the US, and I can't go to court and say, hey, the NSA took my data, I need transparency. So, I mean, I just feel like we're going in circles around this whole issue.


David Heinemeier Hansson  25:21

We are, and I think especially this point that we want the same rights: the rights to what? A kangaroo court that operates in complete secrecy and has a 97% win rate for the people requesting access to data? That's not due process. That's a kabuki theater of due process, right? That's not what you want. But I think it's also just realpolitik here to ask: can you get what you want? Can you get European-style principles enforced on another country? No, you cannot. There are all sorts of things both the US and Europe would like to change about China. Do they have the power to change those things? No, they do not. These are sovereign blocs, and they will absolutely set the course that they feel is right, whether through a democratic process or an authoritarian process or whatever process they have going on for them. But Europe simply does not have the right, well, not the right, but the power to tell Americans, on this level and on something as fundamental as this, how they should go about it. So the only option they really have is to say, okay, well, we will just not allow our companies to deal with that. But they're not willing to do that, right? Because the consequences of doing that are just too high. And this is just one of those things. Again, I want this to be true; I would love for us to have these basic human rights of privacy. It's something I care extremely deeply about. When we built our email service, hey.com, it was one of the things I put the utmost emphasis on: the way we dealt with encryption, the protocols that we had for employees on how to get access to data, the way we went in specifically to analyze the emails that you received to find the spy pixels that companies use to detect when you opened your email, where you were when you opened it, how many times; all this incredibly invasive privacy bugging that happens in email. We went in with hey.com and said, we're not going to have any of that; we're going to lay down the law here. And when it came to the spy pixels, we would name and shame. So if you ever have a hey.com email address and you receive an email from someone who's using a spy pixel, we'll detect that and say, hey, Debbie has tried to send you an email, but she included a spy pixel; you should just know that she can see when you open it, and all these other things, raising the bar here. So this is something I care so deeply about. But at the same time, I also accept the limitations of the international system. The US can't make China do very many things that it wants it to do. China can't make the US do a lot of things that it perhaps wants it to do; such is just the nature of things, and you accept that that's the fact. But it's so politically inconvenient to simply go, hey, we have principles, but they're too expensive. It's too expensive for Europe to live up to its principles. So we will just dance around it, and make this little theater, and pretend internally that we're living up to our principles, even though we accept the fact that reality is just different, that the Americans are going to do what they do.
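For readers curious what "spy pixel" detection can look like, here is a toy heuristic, not HEY's actual implementation (which isn't described in this conversation): flag remote img tags in an HTML email that are 1x1 or hidden, the classic shape of a tracking pixel. It uses only the Python standard library, and the tracker.example.com URL in the sample email is a made-up placeholder.

```python
# Toy spy-pixel heuristic: flag remote, tiny, or hidden <img> tags in HTML email.
from html.parser import HTMLParser

class SpyPixelFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.suspects = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        src = a.get("src") or ""
        tiny = a.get("width") in ("0", "1") and a.get("height") in ("0", "1")
        hidden = "display:none" in (a.get("style") or "").replace(" ", "")
        # Remote images that are invisible or one pixel square are likely trackers.
        if src.startswith("http") and (tiny or hidden):
            self.suspects.append(src)

email_html = '<p>Hi!</p><img src="https://tracker.example.com/open?id=42" width="1" height="1">'
finder = SpyPixelFinder()
finder.feed(email_html)
print(finder.suspects)  # ['https://tracker.example.com/open?id=42']
```

A production system would go further, for example matching src domains against known tracker lists and stripping or proxying the images, but the sketch shows the basic idea of inspecting the email body before it is rendered.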


Debbie Reynolds  28:26

That's true. It is very expensive. So, this is an interesting story: I guess there was a church in Europe that decided they weren't going to do prayers for people anymore because they felt like it was going to be against GDPR, like, do they call a person's name now? See, this is just crazy, have you just gone off the reservation? So I feel like some people just take this way too far. It's like, you know, you still have to do business, you still have to make money, so you can't, you know, punch upward at an intelligence agency. So I think the best thing to do is try to find a way to do business as transparently as you can and not rack up all this, you know, busy work for yourself.


David Heinemeier Hansson  29:20

The part that really just offends me is when the busy work doesn't lead to any fundamental new protections. If all this work that companies were doing on standard contractual clauses, or any of the other stuff in the interim, analyzing Safe Harbor, analyzing Privacy Shield, analyzing the Schrems verdicts, if any of that actually led to any fundamental improvements in privacy for regular people, I'd go, okay, fine. But it does not. All of that busy work is just sort of self-contained in its own little bubble. It has no impact on the real world. And that is just frustrating beyond belief, that we're wasting human potential and human capacity in such unproductive ways. So let's stop doing that. Again, my plea as a fellow business owner, who has one foot in Europe and one foot in the US: ignore this, ignore this for now. One day, maybe, there'll be some clarity and there'll be some real regulations that actually have an impact and help someone somewhere do something. That's the time to engage with it. Now, I understand that's not a plea that can be heeded by everyone, and compliance and risk and all these other factors play in, but at least to the small and medium-sized businesses: just ignore it.


Debbie Reynolds  30:39

You're so feisty. Oh, my goodness, I totally love it. So, if it were the world according to you, David, and we did everything that you said, what would be your wish for privacy anywhere in the world, whether human stuff, technology, law, anything?


David Heinemeier Hansson  31:00

I think the fundamental right to privacy is something all humans should have. And to me, it comes down to two practical things. If you want the data of someone, you have to go through the courts; it has to be due process, and there has to be a warrant. And the warrant has to be specific. We cannot have general warrants. General warrants, by the way, were one of those things that, at the founding of the US, people were very much against, because a general warrant means you just have this dragnet, which is what Snowden revealed was in place: that dragnet where you just pull in everyone's data, guilty, not guilty, probable cause, no cause, it all just gets sucked in. It's a dystopian, horrible place for us to live, this notion of, are they watching us? Are they listening to us in our everyday conversations? It's just an oppressive regime to live under. Again, there are criminals that commit crimes, and as such, during the investigations of those, it should absolutely be possible for law enforcement agencies to get a judge to sign off on a warrant. Fine, we're good so far. That's sort of the legal part of it. And then the other part of it, for me, is around the biggest mass-scale invasion of privacy, which is what happens under surveillance capitalism to drive the advertising regime that we have on the Internet. That, to me, is something that should be banned. This idea that you can target ads on the basis of people's personal information is what leads to all the abuses in the first place, because it's what makes those abuses profitable. If it was no longer profitable, because it was no longer legal to use that kind of data for targeted advertising, I think the whole regime would just collapse; it would be too much work, too much risk, to wholesale aggregate all of this data and run it together, as we have in all these cases with Facebook combining all sorts of data sources together and buying data from here, there, and everywhere. They do that because it informs a regime of extreme ad targeting; if that regime was not something that Coca-Cola and Pfizer and General Mills and all the other large advertisers could use, because it was simply illegal, the whole thing would fall apart. We have to change the incentives. And currently, all the incentives on the Internet are to collect as much as possible from everyone you can and hold it until eternity. That is a catastrophic vision of privacy. So if we deal with those two things, away with the general warrants, away with the dragnet surveillance, scale it back to targeted warrants, and ban surveillance capitalism in its current form; those, to me, are the two most meaningful steps we could possibly take.


Debbie Reynolds  33:53

I agree with you wholeheartedly. I think you've been listening maybe to some of my work, because I've said that for a long time: you know, I think all this over-surveillance of people is creating a guilty-until-proven-innocent type of thing, where you just kind of have to fight your way out or something. You know, it's you against the algorithm that says you did X, like, oh, he walked past the coffee shop that got robbed, so he's a suspect now. You know, it's like, ridiculous.


David Heinemeier Hansson  34:22

Exactly. And it's not a world anyone, I think, really wants to live in. The other thing is this: I don't actually believe there is broad public support for going, okay, we're going to trade our liberty and our privacy for a modicum of security. That modicum of security has repeatedly, time and time again, been disproven. One of the things that came out of the Snowden revelations was how often these major plots were actually being stopped by spying on everyone, and the number of times they could come up with any compelling case where, yep, here's a case where the dragnet totally worked and we stopped some atrocious thing. It basically didn't happen, right? So we're taking on this enormous human vulnerability by giving up our privacy, for what? For not even a material increase in safety. If we had suddenly gone from, okay, every other day there are bombings and terrorist attacks, and then, boom, we put this dragnet in place and they stopped, some people might go, all right, I'm willing to trade some of my fundamental human rights to stop the madness. It's just that this wasn't reality. So we're trading, in most cases, a very real concession of privacy for a phantom idea of safety. It's a bad deal.


Debbie Reynolds  35:39

It is a false dichotomy, right?


David Heinemeier Hansson  35:45

Yes. Yes.


Debbie Reynolds  35:46

The trade between safety and privacy. Well, thank you so much. This has been an electrifying conversation. I'd love to find ways we can collaborate. You're so much fun.


David Heinemeier Hansson  35:58

Excellent. Absolutely. I really liked being on your show. Thank you.


Debbie Reynolds  36:02

Thank you so much. We'll talk soon.


David Heinemeier Hansson  36:05

Alright, sounds good.