"The Data Diva" Talks Privacy Podcast
The Debbie Reynolds "The Data Diva" Talks Privacy podcast features thought-provoking discussions with global leaders on data privacy challenges affecting businesses. This podcast delves into emerging technologies, international laws and regulations, data ethics, individual privacy rights, and future trends. With listeners in over 100 countries, we offer valuable insights for anyone interested in navigating the evolving data privacy landscape.
Did you know that "The Data Diva" Talks Privacy podcast has over 480,000 downloads, listeners in 121 countries and 2,407 cities, and is ranked globally in the top 2% of podcasts? Here are some of our podcast awards and statistics:
- #1 Data Privacy Podcast Worldwide 2024 (Privacy Plan)
- The 10 Best Data Privacy Podcasts In The Digital Space 2024 (bCast)
- Best Data Privacy Podcasts 2024 (Player FM)
- Best Data Privacy Podcasts Top Shows of 2024 (Goodpods)
- Best Privacy and Data Protection Podcasts of 2024 (Termageddon)
- Top 40 Data Security Podcasts You Must Follow 2024 (Feedspot)
- 12 Best Privacy Podcasts for 2023 (RadarFirst)
- 14 Best Privacy Podcasts To Listen To In This Digital Age 2023 (bCast)
- Top 10 Data Privacy Podcasts 2022 (DataTechvibe)
- 20 Best Data Rights Podcasts of 2021 (Threat Technology Magazine)
- 20 Best European Law Podcasts of 2021 (Welp Magazine)
- 20 Best Data Privacy Rights & Data Protection Podcast of 2021 (Welp Magazine)
- 20 Best Data Breach Podcasts of 2021 (Threat Technology Magazine)
- Top 5 Best Privacy Podcasts 2021 (Podchaser)
Business Audience Demographics
- 34% Data Privacy decision-makers (CXO)
- 24% Cybersecurity decision-makers (CXO)
- 19% Privacy Tech / Emerging Tech companies
- 17% Investor Groups (Private Equity, Venture Capital, etc.)
- 6% Media / Press / Regulators / Academics
Reach Statistics
- Podcast listeners in 121+ countries and 2,641+ cities around the world
- Over 468,000 downloads globally
- Top 5% of 3 million+ globally ranked podcasts of 2024 (ListenNotes)
- Top 50 Peak in Business and Management 2024 (Apple Podcasts)
- Top 5% in weekly podcast downloads 2024 (The Podcast Host)
- 3,038 average 30-day podcast downloads per episode
- 5,000 to 11,500 average monthly LinkedIn podcast post impressions
- 13,800+ monthly Data Privacy Advantage Newsletter subscribers
Debbie Reynolds, "The Data Diva," has made a name for herself as a leading voice in the world of Data Privacy and Emerging Technology with a focus on industries such as AdTech, FinTech, EdTech, Biometrics, Internet of Things (IoT), Artificial Intelligence (AI), Smart Manufacturing, Smart Cities, Privacy Tech, Smartphones, and Mobile App development. With over 20 years of experience in Emerging Technologies, Debbie has established herself as a trusted advisor and thought leader, helping organizations navigate the complex landscape of Data Privacy and Data Protection. As the CEO and Chief Data Privacy Officer of Debbie Reynolds Consulting LLC, Debbie brings a unique combination of technical expertise, business acumen, and passionate advocacy to her work.
Visit our website to learn more: https://www.debbiereynoldsconsulting.com/
"The Data Diva" Talks Privacy Podcast
The Data Diva E194 - Rex M Lee and Debbie Reynolds
Debbie Reynolds, "The Data Diva," talks to Rex M Lee, Tech Journalist, Security Advisor, My Smart Privacy. We discuss his career in the cellular phone industry, detailing his involvement in developing HoudiniSoft, a platform that allowed unlocking and reprovisioning of devices. He discusses the legal battles and controversies surrounding the platform, emphasizing its role in empowering consumers to have control over their devices. Additionally, he connects the platform's relevance to cybersecurity and its impact on the industry's landscape, shedding light on the challenges and the consumer rights upheld in the 2009 Digital Millennium Copyright Act (DMCA) ruling. Furthermore, he delves into the broader implications of surveillance capitalism and data mining in the tech industry, highlighting the role of operating systems in supporting these technologies.
Rex Lee delves into the historical progression of operating systems, tracing the shift from iPods to smartphones and the implications for user surveillance. He emphasizes the role of pre-installed apps in data collection and the subsequent adoption of a targeted advertising business model rooted in surveillance capitalism by major companies like Google, Apple, and Microsoft. Furthermore, he illustrates how these companies control global access to Internet trade and commerce, leading to centralization. Debbie Reynolds contributes to the discussion by highlighting the data collection capabilities of unused apps on phones, underscoring the far-reaching impact of surveillance practices. Rex Lee passionately discusses the exploitation of privacy, security, and civil liberties in the digital age, emphasizing the need for an electronic bill of rights to address these concerns. He advocates for individuals to have control over their personal information, opposes surveillance capitalism business practices, and calls for the abolition of contracts of adhesion. Debbie Reynolds agrees with Lee's perspectives and appreciates the insights shared, acknowledging the importance of the discussion on privacy and security in the digital era. The conversation also touches on their professional backgrounds, a mutual appreciation for each other's work, and his wish for privacy in the future.
49:09
SUMMARY KEYWORDS
companies, surveillance, information, app, technology, people, data, smartphone, data mining, device, platform, terms, facebook, product, metro pcs, apple, developed, alphabet, ftc, service
SPEAKERS
Debbie Reynolds, Rex M Lee
Debbie Reynolds 00:00
Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.
Hello. My name is Debbie Reynolds. They call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Rex Lee. He's a tech journalist and security advisor. He's also very prolific on LinkedIn; I see a lot of the speaking engagements that you do, and I always think that you're amazing, as is the information that you share. I'm happy to have you on the show, and please do tell us about yourself, your trajectory in tech, and how you became what you are now.
Rex M Lee 00:53
Well, thank you, Debbie. I really appreciate being on your show. Likewise, I've seen you all over LinkedIn, and we were able to connect, and so it's good to meet you in person. I think we've been following each other for several years now, and just to give your audience a little bit of background on myself, I've been in the tech and telecom industry for close to 40 years. I came out of college and I was working in the radio paging industry. My parents said there was no future in beepers back then, but that was the beginning of the business bell curve for paging and cellular phones, and so forth. I worked my way up. The first nationwide paging company was a company called Page America, and I worked my way up through sales, into general management through acquisitions and so forth. I ended up working for Carnegie Technologies, which was a company that had bought the paging carrier I was working for back in the late 80s, and I stayed on with them, pretty much off and on, for almost 30 years. We went on to build up several paging and cellular phone companies that were acquired by AT&T, T-Mobile, Metro PCS, and so forth, and what we decided to do in the mid-2000s was get into the app and platform development industry before people really knew what apps and platforms were. This was even before the Android OS, Apple iOS, or smartphones were invented. One of the first platforms we developed was for reverse logistics, and this is relevant to my cybersecurity background. It was called HoudiniSoft, and it was very controversial. HoudiniSoft was considered to be the world's largest legal hacking firm at the time. Basically, our reverse logistics platform was centered on unlocking and reprovisioning flip phones at the time, and then later smartphones, the Android devices, as well as Apple iPhones and any Microsoft device supported by Microsoft Windows 8, 10, and 11. We could unlock those whether the locks were encrypted by the manufacturers or they were personal encrypted locks set by the end user. Back then, before LTE, there was a need for unlocking devices, which gave the consumer control over their device, and this is where it got very controversial. In 2009, we were challenged by Apple, AT&T, and the CTIA, the governing bodies of the wireless communications industry, regarding consumers being able to unlock their devices from the manufacturers, and we believed it was the consumer's right. You own your computer, you own your smartphone, you should be able to have control over it, and you shouldn't be locked out of the device. Our first big carrier to adopt this was Metro PCS, and that spawned a lawsuit with Virgin Mobile trying to stop the technology and what it allowed Metro PCS to do. The reason why locking was prevalent back then was because you saw manufacturers cutting exclusive deals with carriers. Like iPhones, when they first came out, you could only go to AT&T and buy an iPhone. You couldn't buy it on any other carrier. The same thing happened when Samsung Galaxy phones started coming out; you could only buy them through Verizon or T-Mobile. You couldn't buy them on the smaller carriers like Cricket or Metro PCS, and HoudiniSoft solved for that.
Somebody could buy an iPhone or a Samsung Galaxy Note phone and take it to an all-you-can-eat carrier, so to speak, an unlimited service provider for a set price, so you get unlimited voice and data for 49 bucks a month on these carriers that were being locked out of the exclusive phones, and HoudiniSoft solved for that by unlocking the device and reprovisioning the device for the consumer, so it gave the consumer choice and the industry choice.
Debbie Reynolds 04:51
Oh, wow. Very interesting. Well, I think as you're talking about that, and some of the back and forth chats that we've had, I want your thoughts on how some of what's happening in tech around apps or even mobile devices is playing more into, I guess, what's termed surveillance capitalism.
Rex M Lee 05:14
Yeah, that's a huge issue, and it's relevant to HoudiniSoft. The Houdini labs were great; every handset that came out, we had to develop it for the application. When Virgin Mobile sued us, we won that lawsuit with Metro PCS. They actually sued Metro PCS, and we were handed a subpoena from a law firm in New York, I think it was Scoggins, and they represented Virgin Mobile, basically Richard Branson, so their retainers were $2 billion a year, but all they wanted to do was make sure we weren't violating any copyright act. This falls into the Digital Millennium Copyright Act (DMCA) ruling of 2009. All the big tech, Google, Apple, all of them, and big telecom went to the Library of Congress and said it should be illegal for a consumer to unlock and reprovision their phone, or flash it, jailbreak it, and flash it. So the Electronic Frontier Foundation, if you know who they are, they're the EFF, they represented Snowden regarding the NSA and PRISM, and all this flows into surveillance capitalism, it's very relevant, had contacted us, and they said, hey, it should be the consumer's right to be able to retain control over their devices. So we need you guys to write the technical rebuttals to the arguments being submitted to the Library of Congress by Apple, AT&T, the governing body of wireless, the CTIA, and so forth. So our handset development team and our lawyers got together and wrote the rebuttal to all of their arguments for the DMCA ruling of 2009, and our arguments were actually upheld by the Library of Congress, who also thought that the consumer should have control over their computers, their smartphones, and everything else. So we actually won the 2009 DMCA ruling, retaining the right for the consumer to be able to unlock and reprovision their phone, or even jailbreak it and root their phone and get rid of all that pre-installed software. Well, the problem with the pre-installed software today is that the operating systems today are totally different. So when you look at surveillance capitalism and surveillance and data mining in general by the app developers and the OS developers, it all has to do with a targeted advertising business model rooted in surveillance capitalism, meaning that the Android OS, Apple iOS, and Microsoft Windows 8, 10, and 11 OS are developed on an open API architecture. I don't want to get too technical for people out there, but that simply means that these operating systems support surveillance and data mining technology in the form of apps and social media platforms and now centralized AI. So that's basically what's happening when you utilize an app: the app is what's conducting the surveillance and data mining on the end user.
Debbie Reynolds 08:07
Right, but then also, I don't want to get overly technical either, but I've noticed, especially after Covid, when they were trying to do a lot of those Covid apps, some of the changes that they were making, they started making them more OS-level changes and chip-level changes that made it basically impossible to opt out meaningfully of some of these things.
Rex M Lee 08:32
This actually started, if you want to look at how this started, in the 90s, with Alphabet. Their first product was the Google web browser and search engine, and what they wanted to do, and let's go back to that point in time, this is important. Most people view Alphabet, Google, Meta, and Facebook as technology companies because they develop tech, and they develop all these platforms, and you see their CEOs like Mark Zuckerberg, and early on, it was Eric Schmidt and Kelly and so forth from Google, and they all get on stage with T-shirts and jeans and tell you how their technology is going to change the world and better the world and so forth. Well, what a lot of people don't realize is that these aren't technology companies to begin with. They're global data brokers who operate within the trillion-dollar global information trafficking industry; if you look at them from that perspective, this is why they develop a free web browser. So if you looked at Alphabet, their first initial business model was to collect as much information on people as possible. Usually, this was done early on by companies like LexisNexis and so forth through consumer purchases, credit card information, getting your footprint, your telephone calls, and all this other stuff. This is how they were collecting information on people. Well, the web browser became the tool that you could collect the most information on somebody with, and then social media with MySpace, pre-Facebook, I hate to date myself for your listeners. But if you remember MySpace, that was the first social media platform, and then later on, Mark Zuckerberg launched Facebook with Sean Parker, the gentleman who invented and launched Napster. They got together on Facebook, and that became the standard. They, too, were in the data mining business. They are data brokers. That's the business they're in. So what these companies do is they develop a surveillance and data mining platform first. What attracts the honeybees, in essence, is the application that they overlay on the surveillance and data mining platform, which is social media. That's the honeypot, and the bees all come and use the free product or the free web browser. There's nothing free in this, and back then you would get a free product, and in return, you're giving up the information associated with your use of the free product to them. Well, that later proliferated to the Apple iOS, which first supported iPods. iPods were the first connected device with an operating system, not a web browser or a social media platform, but an operating system through which you could actually surveil the end user. I think back then it was done for a good purpose with the iPod, to understand what music people liked, not to spy on them, but to be able to sell them music that they liked. I think that's how it all started out. But then later on, the iOS proliferated to a cellular phone and a computer, which we call a smartphone. Then you can imagine, if you could get a lot of information about people from the web browser and their use of social media, imagine how much information you can get about people from their telephones and their computers; it's unlimited. It's all of that information. So once that proliferated to a phone operating system, and then that became acceptable, then you saw the Android OS evolve and be developed by Alphabet and Google. They wanted to get in on that as well. They didn't want Apple to control that market, and that literally changed Microsoft.
At one point, if you remember, Microsoft was being left in the dust, and Windows 7 was the last secure operating system that Microsoft had developed. It was done on a closed API architecture which gave full control to the end user. Do you remember back then, with Windows 7, that you could actually uninstall all of the bloatware that came on your device? Remember that time?
Debbie Reynolds 12:28
Yeah, I remember.
Rex M Lee 12:29
So when you bought a new computer and you were setting it up, you know, I don't want this or that. Well, that all changed with Windows 8, and you never had that choice with iOS or Android; initially, you couldn't touch the pre-installed apps because they were loaded at the firmware level, so you really couldn't control them or uninstall them. So now you were forced to take the bloatware that came on those phones, and the bloatware was designed for surveillance and data mining so they can sell the information to targeted advertisers and make gazillions of dollars. Well, that happened, and that evolved to Windows, and with Windows 8, 10, and now 11, they're all backed by a targeted advertising business model rooted in surveillance capitalism. Now, the fallout from all this is that it's led to these large monopolies. A lot of people ask me about Internet centralization. You've heard that term, right? The Internet is centralized.
Debbie Reynolds 13:25
Yeah, right.
Rex M Lee 13:26
A lot of people, they don't know what that means. It's like, okay, Rex, it's not really centralized, because I can, in fact, bypass Google, Apple, Microsoft, and type in a URL and go to any website that I want. So I have that freedom, correct?
Debbie Reynolds 13:41
I don't think that's true.
Rex M Lee 13:44
Yeah, well, you can. I mean, if you wanted to go to your bank and not use the app, and that's the trick here, you can still type in, let's say, any bank, like Frost Bank, www.frost.com. That gives me control to bypass Google, Apple, and Microsoft. But when was the last time you typed in a URL to go to any business? You don't do that anymore.
Debbie Reynolds 14:06
You search, and then you click on the link.
Rex M Lee 14:10
Also, what happens after you establish your account? Now, you go to the app. Everybody has an app, so that's Internet centralization. You either go to those businesses through Google's web browser or Microsoft's web browser or Safari, Apple's web browser. So now you have to understand that through Internet centralization, there are actually three companies that control global access to Internet trade and commerce: Google, Apple, and Microsoft. So a lot of times, people hear the term that, oh, well, I'm first sold to advertisers when I set up a Facebook account, and everyone says they know that. They know they're being surveilled and data mined. So they just say, well, I'm first sold to these advertisers through my Facebook account. Well, that's not true. Facebook had to go, and I hate to position Google, Apple, and Microsoft like this, but they basically are like the mafia families. You have to go to the Gambinos, the Sopranos, and the Corleones to get access to the OS end users. So for Meta to launch Facebook, to get to you and me, they first have to go to Google, Apple, and Microsoft and cut what's called a pre-installed app deal that costs billions of dollars, where they take the Facebook app and pre-install it into the operating system that's pre-installed into billions of devices. That's why, when you buy an Android phone, an Apple phone, or a Microsoft computer supported by 8, 10, or 11, you see the Facebook app has already been pre-installed in there, or TikTok or Amazon; those are the companies rich enough to go to the Gambinos, the Corleones, and the Sopranos and pay them billions of dollars so that their apps are pre-installed. Why do they do that? That gives them a huge competitive advantage over any other app or social media platform out there. When you're already on the device, nobody has to go to the App Store to look for what social media platform they want. So that gives them full control, and if you or I were to start up a social media platform, do we have $10, $20, $30 billion? That's what it literally costs to go to these companies and, say, pre-install our apps. No, now we get shoved into the App Store with millions of other apps, and we're lost in the static, and this is how, through centralization, you only see a handful of companies controlling global access to Internet trade and commerce, and unfortunately, all of these companies relevant to surveillance capitalism have adopted a targeted advertising business model rooted in surveillance capitalism.
Debbie Reynolds 16:48
That's all true, and then also, one thing that people don't understand is even if you have an app on your phone, you don't even have to use it for it to collect data. The fact that it's on your phone collects the data. So I know some people say, oh, I have this app, but I don't use it. It's like, well, it's still collecting the information. They're sharing the information with one another.
Rex M Lee 17:10
You made a comment earlier that's highly relevant to this. So first of all, when I do a security session, I do security sessions all over the world. I'm also a former advisor to the Department of Homeland Security, the National Security Agency or NSA, other government agencies, as well as lawmakers. Now, I've been an advisor for lawmakers regarding congressional hearings involving tech giants and CEOs like Mark Zuckerberg, CEO of Meta; Sundar Pichai of Alphabet, Google; Jack Dorsey, formerly of Twitter, now X, and so forth. I was involved in four congressional hearings dating back to Facebook and Cambridge Analytica, and the last one I did, I was an advisor for Senator Blackburn's office for the Facebook whistleblower hearings with Frances Haugen. One of the problems that people fail to realize about what's going on out there, and why all this is relevant, is something you said earlier: when you start to use your technology, you're forced into it by having to agree to a set of terms of service, correct? You have to click on I Agree to use any of this technology. Well, what you're agreeing to do is give up your privacy, security, and even your safety, because these companies do not indemnify their end users, even though they collect all of your personal and business information, package it, and sell it for trillions of dollars. They don't want to be held liable if somebody were to hack that information, like Jennifer Lawrence, whose iCloud was hacked. It wasn't just all her photographs that were collected. It was her address book, her files. I mean, if these were nefarious characters, like a criminal organization, they could have kidnapped her family members through knowing who they are, their addresses, having their phone numbers. All that information was hacked along with the photographs. We mainly heard about the photographs. Well, she learned something that everybody else didn't know: when she tried to hold Apple liable, she couldn't hold Apple liable, because she clicked on I Agree, and she realized that she was not indemnified. So even if they're neglectful with your information and it got out, and God forbid, somebody got harmed over it, you still cannot hold them liable because you've agreed to this. Now, this is a sneaky way to circumvent privacy laws, the GDPR in Europe and the California Consumer Privacy Act, and so forth. They only protect you when you access the platform by going to www.facebook.com, and you get on Facebook; then the privacy laws kick in to protect you, when it's called online privacy. However, what the tech companies have done is say, that's fine, they can go to the platforms using the URL all day long, and the privacy laws are going to apply there. Where they don't apply is when you enter into an agreement and enter their domain. You have to understand this: when you use that app on your phone and you accept I Agree, you're in their house, you're in their domain. You're no longer on the Internet where you are protected by privacy laws, so the terms of service actually circumvent the privacy laws. So if you were to go back now and say, hey, they're still collecting data, they're still surveilling me; I had a conversation about my medical problems, I got home, and I started getting medical ads.
If you thought you were protected and you took them to court over that, they're going to go back and say, well, you should have just accessed Facebook through www.facebook.com, we give you that option, but you clicked on the app and you accepted the app permissions. Here are the app permissions: we're allowed to collect your contact information. We're allowed to collect your text messages. We're allowed to gain control of your camera, your microphone, and all the sensors on your device, including your accelerometer. So this means they can do audio, video, and physical surveillance on you, 24/7, 365 days a year, collecting over 5,000 highly confidential data points associated with your personal, business, medical, legal, employment, biometric, and location information, and they can package that and sell it to targeted advertisers, third-party data brokers, and so forth. So this is a big, huge issue. The terms of service are actually defined in legal terms as a contract of adhesion, meaning that if you don't accept the agreement, you can't use the products and services that you're paying for, like your smartphone, your smart TV, your connected appliances, the electronics in your connected vehicle, and so forth. If you reject that contract of adhesion, you cannot use the products or services that you're paying for.
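For readers who want to see what the app permission statements Rex describes look like in practice, here is a minimal sketch, not from the episode, of how a developer might list the permissions declared by the apps that shipped pre-installed on an Android device. It uses Kotlin and the standard Android PackageManager API, and it only reads the permissions each package declares; it does not grant or exercise any of them.

```kotlin
import android.content.Context
import android.content.pm.ApplicationInfo
import android.content.pm.PackageManager

// Illustrative sketch: print the permissions declared by pre-installed (system) apps.
// Call this from inside any Android app that has a Context. On Android 11 and later,
// package-visibility rules may hide some packages unless the calling app declares
// <queries> entries or the QUERY_ALL_PACKAGES permission in its manifest.
fun dumpPreinstalledAppPermissions(context: Context) {
    val pm = context.packageManager
    val packages = pm.getInstalledPackages(PackageManager.GET_PERMISSIONS)
    for (pkg in packages) {
        val appInfo = pkg.applicationInfo ?: continue
        // FLAG_SYSTEM marks apps that shipped with the device image.
        if ((appInfo.flags and ApplicationInfo.FLAG_SYSTEM) == 0) continue
        val requested = pkg.requestedPermissions ?: emptyArray()
        println("${pkg.packageName} declares ${requested.size} permissions:")
        requested.forEach { permission -> println("  $permission") }
    }
}
```

In the episode's terms, each entry this prints corresponds to an in-device application permission statement that the end user accepts, knowingly or not, when they click I Agree.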
Debbie Reynolds 22:05
Oh, my goodness, that's terrible. I knew part of that, but you put a fine point on it for people who don't really understand. What are your thoughts about the need for an electronic bill of rights? I want to know more about that.
Rex M Lee 22:18
Herein lies the issue. Back in 2013, I left HoudiniSoft. Once LTE came, there was really no need for HoudiniSoft anymore, because the manufacturer locks were primarily locking CDMA phones, and that was our solution. Once everything became universal with LTE, there was no more need for HoudiniSoft, and I got into security consulting. Running the world's largest legal hacking firm, HoudiniSoft, you can imagine what we discovered in our labs. This is where I discovered how much surveillance and data mining the app companies were conducting on the end users and all of that. I ended up working for a defense contractor, doing my first consulting agreement with Space Data Corp out of Chandler, Arizona, and this is highly relevant, because back in 2009, Alphabet, Kelly, and some other founders had visited Space Data. They liked their technology. What Space Data did was build high-altitude communication platforms for the military that were used in the theater of war in Afghanistan and Iraq. So they led Space Data down the primrose path of acquisition, and one of the first meetings that Jerry Knoblauch, the CEO, asked me to attend was a board meeting. He came into the meeting and he was very angry at the fact that they were in competition with Alphabet. They said Alphabet launched the Google Loon program, which was basically Space Data's technology, to provide Internet services in remote areas of the world, and not for military use. But Space Data did the same thing for communications in remote areas of the world where there was no infrastructure; you could use Space Data as infrastructure. After the meeting was over, I asked Jerry. I said, well, most of your executives and board members are using Android devices, and your contractors, like myself, can bring any device that I want to use. These devices are unsafe, Jerry; they're surveillance and data mining devices. What's worse is, who developed the Android OS? It's your competitor now, Google and Alphabet, so you're bleeding your company's information through your endpoint devices, your computers if you're using Chrome web browsers, and so forth. Long story short, he said, can you prove that? I said that not only are you exposing your information to unauthorized third parties, but also to your business competitors, including companies from China. He asked me, can you prove this, and I said, yeah, I'll reverse engineer the Samsung Galaxy Note smartphone. So I reverse-engineered the Samsung Galaxy smartphone from an application standpoint, not to do anything illegal with it, and at the end of my report, I had identified 18 multinational companies, including a company from China, Baidu, that were enabled to monitor, track, and data mine everything I did on that device, whether it was a personal activity, a business activity, whether I was visiting my doctor, and so forth; it didn't matter. They were collecting all of that information and they were monetizing it for profit. So I was the first person to discover pre-installed Chinese surveillance technology in a mainstream smartphone, and then that went on. Jerry understood that. They actually went on to sue Alphabet, and they were one of the few companies to win the lawsuit against them, because they were both in their engineers and IP and all that. But that led me to become an advisor for DHS.
I met a DHS agent at a utility trade show and showed them my smartphone report, and then David Nolan, the head of DHS Science and Technology, in 2016 asked me to be an advisor for the Department of Homeland Security Science and Technology study on mobile device security. Through that smartphone report, I understood that there were several things going on here that were out of the control of the product owner. One, you had no control over the pre-installed apps. Two, if you rejected the terms of service, then you couldn't use the product that you were paying for. So when I was able to finally go public with this information, I had to prove my smartphone findings. I actually proved them through T-Mobile and Verizon. I'm the only person in the world to get an admission from T-Mobile and Verizon that smartphones, tablet PCs, and connected products supported by the Android OS, Apple iOS, and Microsoft Windows 8, 10, and 11 OS are not private forms of telecommunications or computing due to uncontrollable pre-installed surveillance and data mining technology that the end user cannot uninstall. So once I went public with that information, CBS News, a gentleman over there, Kevin Tedesco, at 60 Minutes, they were talking to me about doing a story about me finding the Baidu pre-installed surveillance technology in the Android device and my report overall. That got crushed; I don't know, they just stopped talking to me. So that led to other publications, like Mission Critical Communications magazine, doing a story on my findings, and then the Epoch Times and so forth. Other publications started doing it, but the mainstream media companies weren't publicizing my findings. Why? Because if you look at the biggest advertisers on their networks, it's Google, Apple, Microsoft. Well, my findings implicated all of those companies, Samsung, Android, Apple, Google, and so forth, and that led me to write a two-page article on the need for an electronic bill of rights that would protect consumers from predatory surveillance and data mining business practices rooted in surveillance capitalism. Where we're not protected is in our inability to reject the terms of service associated with products that we're buying. So if you go way back to our early discussion, this all started with free products and services. I'm okay with that. You give me something for free, and you want my information to some extent, only when I'm using your product, not 24 hours a day, not my medical information, not my business information, only the information associated with the use of your app or your platform. If you want me to trade that to you for a free product, I'll do that. That's not happening today. Your smartphone costs you thousands, if not tens of thousands of dollars over time through the voice and data plans, and your connected vehicle costs you tens of thousands of dollars. Your smart TV costs you thousands of dollars. This predatory business model has proliferated to all the products and services that we're paying for today, which means we can't use those products if we don't want to accept their predatory terms of service. In essence, I hate to use this term, but we've all been, like in the movie The Matrix, enslaved to the Silicon Valley matrix, producing personal and business information, medical information, biometric information, location information. We're producing all of this information that these companies are collecting by way of the products and services that we're paying for, and they're not compensating us for that.
This is the definition of cyber oppression and tyranny, because now it has proliferated to my products. Before, they could say, well, don't participate. Well, name one product that's not supported by the Android OS, Apple iOS, or Microsoft Windows OS, and good luck if you went to a private or secure Linux OS, like PureOS, where there's no surveillance or data mining; who are you going to communicate with or connect with using Linux when the world's standard is Apple iOS, the Microsoft Windows operating system, and the Android OS? That's the standard in the world, and we're all being forced into this predatory business model. This is why we need an electronic bill of rights, mainly to target these contracts of adhesion, which are the terms of service we're forced to accept when we click on I Agree.
Debbie Reynolds 30:05
Yeah, I completely agree with that. I never heard anybody actually explain it that way, about adhesion. The thing that concerns me about these contracts is that they're so asymmetrical; it's astronomically asymmetrical. Let's say I download your app; maybe I use it for a week or so, but then should I give up my first child as a result of that? So the value exchange is completely off the rails in terms of the value that we get versus the data that we give, and then it just goes out in a while.
Rex M Lee 30:39
I spoke at AES, which is a medical conference in Chicago, in February. There were over 400 representatives, doctors, dental labs, and everything at this conference, and I like to ask people this: would you define Google, Apple, Microsoft, Meta, Amazon, and ByteDance, the developers of TikTok, as technology companies? 95% of the hands go up; oh, yeah, these are tech companies. You see their CEOs out there with their T-shirts trying to act like you and I, saying, oh, our technology is going to change the world. Behind the closed curtain, like in The Wizard of Oz, when you look behind the veil, you realize that these aren't technology companies. They are international, global data brokers working in the trillion-dollar information trafficking industry. That's how Wall Street values these companies: by how much information they can collect from their end users, and you nailed it right there when you said, hey, my benefit is minimal compared to the benefit that they're getting. That's exactly right. They're developing their technology more for their benefit than even for their paying customers. You can't say end user anymore when you're talking about somebody who buys a smartphone or connected vehicle. Now, again, when you look at the terms of service, I was asked by an attorney to look at whether the terms of service were a legal consumer contract. He asked me, he goes, can you do an analysis against the Texas Deceptive Trade Practices Act or the FTC regarding consumer contracts? I found 24 violations of the Texas Deceptive Trade Practices Act associated with my analysis of the terms of service that supported the Samsung Galaxy Note that I reverse-engineered. I wanted to see if all this data mining was legal, and did I, in fact, agree to it? Well, first of all, when I added up the contract, it came out to over 3,000 pages of complicated legalese that was written in a torturous manner that I couldn't understand. It took me basically four months to understand all the legalese associated with each app, because when you click on I Agree, you're not accepting just the app agreement. You're accepting the overall agreement for the operating system and the 18 companies that were responsible for the pre-installed apps. You accept all of their agreements and all of the legalese in two sets. Now, here's what people don't understand. There's one set of legalese that you accept that's very transparent to you. What's transparent to you are terms and conditions, online privacy policies, and end-user license agreements. These are all published online. So when you go in to accept the agreements, you say, okay, I'm going to click on their privacy policies, and I can read their privacy policies. In some instances, you do get some controls over the pre-installed apps through app managers and so forth, and so you can control some of it, but not all of it. What's unbeknownst to you, Debbie, and everybody else is that you actually agree to another set of terms of service that's not transparent to you, in the form of in-device application permission statements and application product warnings. There are actual product warnings associated with the use of these apps that are hidden in the device, within the in-device legalese. So what they do is they split up the language.
The most important language, which tells you how much surveillance and data mining they're doing on you, including the application product warnings, is within the application permission statements that are hidden within your device, and they actually contradict the online privacy policies. I'll give you an example. When you use an Android app that has a personal information application permission associated with it, it will tell you online that your identity is protected by Alphabet and Google. It'll state that when we share your information with a third party, you are not identified. However, when you use the Google app, it states the exact opposite. It says that they can share your information with third parties, which they define as others, and they can identify you. So it states in there that they can identify you. I can show you the online language and the in-device application language, if you'd like to see that; if I could share my screen, I could show that to you, because when people see this, yeah, everybody's in the show-me state. I can actually show that legalese to you if you want to see it.
Debbie Reynolds 31:13
Well, let's finish the recording first. But yeah, I believe you.
Rex M Lee 31:33
Yeah, I'll show you all this. It's in my presentation deck. There's a lot more. I'll even show you the application warnings. Also, one of the warnings warns you to censor your speech. Now, here you bought a telephone, here you bought a computer, and now there's a warning that's not online in the privacy policies or the terms and conditions. It's hidden in your device, warning you that they're sharing information with third parties and that you should actually censor your speech. Now, you would think that product warnings should be published online and not hidden in the device; this would be tantamount to a cigarette company hiding the product warning within the packaging of the product, so after you consumed the product, you realized you gave yourself cancer because the warning was on the inside of the product. Now that, to me, was one of the violations I found within the terms of service, along with 24 other violations, and I could not get either the FTC or the Texas AG's office, Ken Paxton's office, to pursue this. I went there to show them; I actually sent it and filed a complaint with them, and they were not responsive to my complaint. So I went to the Texas AG's office to say, hey, I've identified all of these violations of the Texas Deceptive Trade Practices Act associated with these terms of service that support my smartphone, no different than if I were signing an illegal contract to buy a car. If you found out that that contract was illegal when I bought the car, you would shut the dealer down. So here's what I found in my phones: hidden product warnings, application permission statements that aren't published online that tell me how much surveillance is conducted on me, and so forth, pre-installed Chinese surveillance technology in the device. They literally asked me to leave their office. They led me out, and I couldn't understand why this was going on, nor could I with the FTC. Well, the tech lobby is so big, and they've spread the money around so much, that it's almost impossible to get the FTC or state AGs to enforce existing consumer protection laws. This is why we've had over 30 congressional hearings about nefarious acts regarding social media, app developers, Google, and so forth. The most recent one was child exploitation involving Mark Zuckerberg, where he actually apologized to the parents of the kids who were harmed or killed using Instagram or Facebook at the congressional hearing in January. Why is that? Why were they harmed or killed? Because these apps and platforms are addictive, and they're supported by brain-hijacking technology that's associated with manipulative advertising technology. That's what's addicting these kids, and not just kids, adults too. They make it to where you scroll downward, like a slot machine. It's like a slot machine; it goes down and you scroll, waiting for that reward, the thumbs up and so forth. Those are social validation feedback loops and intermittent variable rewards designed to addict the end user to the platform so that they can continue to get the end user back on the platform. The other thing they do is, when you finally do put this down for a while, the notification will notify you that, hey, somebody commented on one of your posts, and then you pick it up because of the notification.
They designed the algorithms to be purposely divisive, so as a result of the brain-hijacking technology, the social validation feedback loops, and the divisive nature of the algorithms, kids, whether it's on TikTok, Instagram, Facebook, or Snapchat, are being addicted, harmed, and even killed by the use of this technology. Now, in closing, I know we're getting long on time, but this is important. There's no other consumer product out there that can be intentionally developed like this. Sean Parker has admitted that they intentionally developed Facebook using brain-hijacking technology. Tristan Harris, he's in The Social Dilemma, the documentary that Netflix did on the addictive nature of social media. Tristan Harris was a former product manager for Google. He admitted this multiple times over the last 10, maybe 13 years, that this technology was harming and even killing kids. And then Frances Haugen also testified to this fact during the Facebook and Instagram whistleblower hearings. Yet not a single executive, not a single social media company, has been held accountable for the development of these addictive apps and platforms. In light of that, now, if you or I were to open up a car seat manufacturing company and we manufactured 10 million car seats and sold them last year, and of the 10 million, only five car seats were defective, four injuring a child and one killing a child, this is how the FTC would approach that situation. The FTC would require you to recall all 10 million car seats, not the handful that harmed or killed somebody. They make you recall all of them. The same thing happened with the Takata airbags; they injured about 36 and killed about 12 people. The Takata airbags were in over 200 million cars. They had to recall all 200 million of those airbags. Yet these social media companies can put a highly addictive and harmful product out there, have their product designers testify in Congress that the products are addicting, harming, and killing the end user, and not one government agency responsible for consumer protection, the FTC or the state AGs, has made them recall their social media platforms and shut them down until they're safe to use. They're allowed to continue to go on, and that's due to the tech lobby and how much money these technology companies are spending. And, oh, by the way, I was part of the Section 230 hearings, about these companies not being held liable for what you post; that has nothing to do with consumer products or a product that's defective at all. Section 230 does not protect these companies from liability for their product harming somebody. That's the FTC's ball game, and that of the state AGs that control the state consumer protection agencies. It's their responsibility to enforce those consumer protection laws, and it has nothing to do with Section 230.
Debbie Reynolds 42:12
Yeah, wow. That's a lot to digest. I want to point something out that is really interesting. What you said triggered this with me, and this is what I'm seeing. So the comparison that you made, I think a lot of this is buried in legal theory around what harm is, especially when you're talking about privacy and different things like that. A lot of these lobbies come out of the woodwork if you're trying to define harm in a way that's not tangible. They don't think about it in terms of mental harm. So this has to be some type of physical harm or something that is more visible in some way, and so I see that a lot. But what are your thoughts?
Rex M Lee 42:56
So let me go back to that. Up until the Facebook and Instagram whistleblower hearings with Frances Haugen, you could state that; however, what the public doesn't understand is that Frances Haugen turned in an internal document from Meta's own research that showed a 13% harm rate with the use of their product. Once they did that, that theory goes out the window. They themselves conducted their own internal investigation. Now, 13%. I went back to five car seats injuring five people out of 10 million. That's .01. That's barely measurable in terms of the threat that those car seats presented to the consumer, whereas if you take a 13% harm rate with 2 billion people on their platform, you're talking tens, if not hundreds of millions of people that have been harmed, injured, or killed. Why do you think Mark Zuckerberg felt compelled to turn around in Congress and apologize? Up until the internal document that she gave Congress and until his apology three years later, you could still say that; it goes out the window right there. That's an admission of guilt. You can also look at an admission of guilt in 2017: Sean Parker's admission about social validation feedback loops being harmful came during an Axios interview. He actually said, in the Axios interview, that God only knows what this is doing to our children's brains. It was me, it was Mark Zuckerberg, it was Kevin Systrom. It was all of us; the inventors knew this, understood it, and we did it anyway. He literally says that, then looks at the camera and smiles, and he goes on to gloat about the billions of dollars that they were making at the expense of the end user's privacy, security, and safety. See, a lot of people think that losing privacy is no big deal. That's not what's going on here. We're being exploited at the expense of our privacy, security, safety, and even civil liberties. When you look at the control of speech on these platforms, whether you're right or left, they're controlling speech, and that is, to an extent, how the government is eliminating human rights like free speech and freedom of the press. They do it through proxy, because they know everybody communicates on these platforms today. They get their news from these platforms, so they leave the dirty work to these private companies and say, well, it's not the government that's surveilling and data mining all these people and controlling their speech and controlling freedom of the press. That's not us. But that's how we're all getting our news, isn't it? We're all communicating on these platforms, and again, all this draws back to the need for an electronic bill of rights for the 21st century, because our lives are online, and our jobs require us to be online today. That wasn't the case 20 years ago. Today, if you want to make a living, you have to use this technology, and most of your life is required to be online today.
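For readers following along, here is the rough arithmetic behind the comparison Rex draws, using the figures as stated in the episode: five defective car seats out of 10 million sold, and a 13% harm rate across roughly 2 billion platform users (the 2 billion is the episode's round figure, not an exact count).

\[
\frac{5}{10{,}000{,}000} = 0.0000005 = 0.00005\% \quad (\text{about } 0.5 \text{ defective seats per million sold})
\]
\[
0.13 \times 2{,}000{,}000{,}000 = 260{,}000{,}000 \ \text{people}
\]

On those numbers, the harm rate Rex attributes to the platforms is several orders of magnitude larger than the defect rate that, in his car seat example, would trigger a full recall.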
Debbie Reynolds 46:04
Yeah, oh, wow. So, if it were the world according to you, Rex, what would be your wish for privacy anywhere in the world, whether that be regulation, human behavior, or technology?
Rex M Lee 46:17
I think when it comes to products and services that you pay for, I'm not against surveillance and data mining business practices. At Carnegie, we developed mobile wallets; we were the seed investors for that. Paul Posner, the CEO of Triller, the social media platform that sponsored the Mike Tyson fight a while ago, and all that. I'm not against it. What I'm against is the contracts of adhesion forcing you into it by way of products and services that you're paying for. You should be able to control that and disagree with these surveillance and data mining business practices through the terms of service. So I think, ultimately, the contract of adhesion needs to be abolished, and people need to be able to agree or disagree and still use the products and services they want. I firmly believe that people should be able to sell their personal information if they want. My personal information? Pay me for it. If not, pay for my TV, pay for my connected car, pay for my smartphone, pay for my recurring subscriber voice and data plans on all of these products and services. You pay for all that, and I'll give you all the information you want, but until then, stay away from my personal, business, medical, biometric, and location data on products that I'm paying for if I don't want to agree to your predatory surveillance and data mining business practices. That is what needs to be put into the California Consumer Privacy Act, the GDPR, or any other privacy legislation: full control over anything supported by a surveillance capitalism business model, and full control over the ability to reject or accept these agreements.
Debbie Reynolds 47:51
I agree with that completely. Well, thank you so much. This is amazing. I'm glad that you were able to share this information, and I'm sure the audience will definitely enjoy the things that you say. I know a lot of people don't know this area nearly as well as you do, and I'm so glad you brought all the facts here. Great.
Rex M Lee 48:09
Yeah, I was on the bad-guy side as an app and platform developer. That's one thing that separates me from anybody else. When I write articles about this, or I speak on it at a trade show, or I do an online investigative report, I do it from an industry insider's perspective. I'm in the app and platform business today as a consultant. I'm still in this industry, so what I like to do is divulge information as somebody who has actually done this for a living, not somebody looking at the industry from the outside in. So thank you very much for having me on. If people want to learn more, they can go to www.mysmartprivacy.com; that's my website, www.mysmartprivacy.com.
Debbie Reynolds 48:53
Thank you so much, and I'm sure we'll be in touch soon.
Rex M Lee 48:56
Definitely.
Debbie Reynolds 48:57
Thank you. Thank you.