"The Data Diva" Talks Privacy Podcast

The Data Diva E188 - Arielle Garcia and Debbie Reynolds

June 11, 2024 Season 4 Episode 188


In episode 188 of “The Data Diva” Talks Privacy Podcast, Debbie Reynolds talks to Arielle Garcia, Director of Intelligence, Check My Ads Institute, about Data Ethics, Responsible Media & Tech. We discuss Arielle’s journey from law school to founding ASG Solutions, specializing in privacy and data ethics. We explore the brokenness of the advertising industry's incentive system and the challenges it presents, highlighting the impact on data practices and effective marketing. We also discuss the ethical implications of using AI and voice recorders for medical transcription, expressing concerns about the marketing industry's lack of accountability and responsible data use. We emphasize the need for consent and legitimate purposes for data usage, highlighting the broader chain of events that often leads to the misuse of data. The conversation delves into the complexities of balancing competition and privacy, expressing concerns about the use of sensitive data in advertising and the challenges of enforcement. The discussion also addresses the implications of AI for privacy, expressing concerns about the lack of transparency and accountability among big companies and ad tech. We explore the challenges and potential negative outcomes associated with Google's Privacy Sandbox and Performance Max products, emphasizing how marketers are conditioned to relinquish control and transparency. Additionally, we discuss privacy and technology in the femtech industry, addressing the implications of data collection and the need for stringent safeguards to protect sensitive information. Arielle advocates for banning the surveillance advertising business model, citing its detrimental impact on human rights, civil rights, and democracy, and shares her hope for Data Privacy in the future.

Many thanks to the Data Diva Talks Privacy Podcast Privacy Visionary, Smartbox AI, for sponsoring this episode and supporting our podcast. Smartbox.ai, named British AI Company of the Year, provides cutting-edge AI that helps privacy and technology experts uniquely master their Data Request challenges and makes it easier to comply with global data protection requirements, FOIA requests, and various US state privacy regulations. Their technology is a game-changer for anyone needing to sift through complex data, find data, and redact sensitive information. With clients across North America and Europe and a major partnership with Xerox, Smartbox.ai is bringing their data expertise right to our doorstep, offering insights into navigating the complex world of global data laws. For more information about Smartbox AI, visit their website at https://www.smartbox.ai. Enjoy the show.

Support the show


30:57

SUMMARY KEYWORDS

data, privacy, people, sensitive data, advertising, harming, identifiers, work, industry, companies, happening, regulation, doctor, ai, ad tech, interesting, cookie, fascinating, years, transparency

SPEAKERS

Debbie Reynolds, Arielle Garcia



Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.


Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, Arielle Garcia. She is the founder of ASG Solutions. She's also an Advisor/Advocate. She focuses on privacy and data ethics, and she's an expert in media and tech. So welcome.


Arielle Garcia  00:42

Thank you for having me, excited to be part of this.


Debbie Reynolds  00:47

I'm excited to have you on the show. I actually had the pleasure of seeing you speak in New York a couple of years ago at a Ketch event. I think the IAB was having a conference, or there was some other media advertising conference happening in New York at that time. So we had a lot of marketing people there, and I was very impressed with you and your presentation. We were connected on LinkedIn anyway, but then I saw you recently, within the last few months, quoted in a Digiday article about ad tech and privacy, and I thought, oh, you should have her on the show. I'm happy to have you here. I would love for you to introduce yourself and tell the trajectory of your career. How did you get to where you are now?


Arielle Garcia  01:30

Yeah, sure. So, I have an unusual trajectory. A little over 10 years ago now, I joined a global media agency called UM as an administrative assistant. My plan was to go to law school at night and then go change the world or something. When I started law school, this was just before GDPR was about to take effect; I was growing in my career at the media agency, and so on and so forth. I was an evening student, so it was my third year out of four, and I took an eDiscovery class where my professor was talking about how disruptive GDPR was going to be to his line of work. Then I would go to work, and no one was really talking about GDPR stateside in the ad industry yet. Whatever conversations there were, were like, yeah, we're capitalists, so we'll be fine over here; I'm sure it's nothing. I realized that there was an opportunity for me to learn everything I could about this because, inevitably, yes, it would be relevant and important to understand privacy and advertising. That led to me ultimately leading CCPA readiness efforts, which turned into me becoming Chief Privacy Officer. Then, in September of last year, I very loudly resigned from my role as Chief Privacy Officer, because my concept of what was so exciting about the advertising industry and being a privacy practitioner within it was: hey, we're sitting between billions of dollars of ad budget and the platforms that control the information economy. Seems like a pretty good chance to make a change and to demonstrate how being responsible stewards of customer data translates into business impact. What I found was that the incentive structures were so broken that there's paralysis, so I left in order to break free from that entropy. I founded ASG Solutions to help marketers achieve sustainable growth through more respectful and responsible advertising and data use.


Debbie Reynolds  03:44

Well, that's an admirable goal. So I'm glad you broke free, and I'm happy to see you being quoted in articles and magazines; you definitely have struck a chord, especially with me, as my space is emerging technology and privacy. Tell me a little bit about it; I love what you said about the brokenness of this incentive system.


Arielle Garcia  04:06

Yeah.


Debbie Reynolds  04:06

How do we get out of this brokenness, I guess?


Oh, so before we started recording, you were telling me about this case about medical transcription using AI and voice recorders. I told you that I have this high-tech doctor who uses all these wacky AI things, including voice recorders, which I personally like, because instead of the doctor trying to type stuff, they can actually look at you and talk while it's recording. But I'm always concerned about who else listens to the recording and what else is happening with it. There was a case that came up that you wanted to chat about. Talk a little bit about some of those types of services.


Arielle Garcia  05:10

Yep. So a few weeks back, news broke about Publicis, which is another global agency holding company, and Publicis's role in the opioid epidemic. Part of the settlement related to work they did for Purdue that involved recruiting doctors to record doctor-patient conversations. What they did with that doctor-patient dialogue data was farm it for insights into what made patients reluctant to go on OxyContin, for example. They then used that data to inform doctor education efforts to overcome those patient anxieties. They also recommended (I don't know if they actually went through with it) approaches for proactive patient education, which, according to the Massachusetts complaint, included convincing people that it was safe to take opioids in higher and higher doses. To your point, I actually wrote an op-ed about this, because the other thing that happened is that after the settlement news broke, an employee of a Publicis unit called Performics, Emily Duchamp, posted on LinkedIn about how she lost both of her parents to opioids and didn't know anything about these complaints against Publicis until the settlement came out. She essentially asked them, what do you say to employees like me? One of the things I said in the op-ed was that I'm not saying this data can't be used for positive outcomes; I'm saying that the industry does not have the incentive structures in place to limit itself in that way. So I think with the example you give of your doctor, it's exactly that: there are perfectly legitimate ways for this data to be used when you're providing consent. I would argue that there's no legitimate need that outweighs the risk of this data being usable for advertising. The industry is not responsible or accountable enough for that.


Debbie Reynolds  10:40

In those types of situations, when I think about this, even though the story broke because, as you say, data was being used for advertising, which it shouldn't be, I find that when you dig deeper, a lot of things happened wrong; it's like a chain of events that should not have happened. First of all, just having the pharma company be involved in those patient-doctor conversations, because, obviously, people were talking about things other than opioids and different things like that. So to me, it's almost like you fall from a tree and hit every branch on the way down. I think being able to find that root cause is really important, and it's probably another reason why we're seeing a lot of regulations around third-party data transfer and also the purpose of data use. What do you think?


Arielle Garcia  11:35

That's exactly it. Because so much of the conversation from an ad tech industry perspective is around how to get as cute as possible about what constitutes PII and what's de-identified and all of that, to the extent that the laws are dealing with that, we just see a ton of loopholes codified and a lot of workarounds. Something I'm very interested to see is how the CPA is going to go about enforcing its data minimization and purpose limitation provisions and bringing that notion here of data being used in ways that are compatible with the purpose for which it was collected. I'm very interested to see how that will play out as we see the explosion of retail media. This is where I've always thought that transparency and being responsible with people's data is a business imperative, because one could imagine that we can figure out a way to explain to people how their purchase data is used to enable advertising beyond the walls of the retailer's website, let's say. In the absence of that, however, and without restraint (because I have questions also about sensitive data and sensitive purchases that I'm, as an aside, also interested in), in the absence of actually figuring out what that way is to convey the value exchange, I think we're going to end up in the same exact spot that we're in now. So I'm very interested to see how those purpose limitation provisions play out in the context of a lot of the ad tech developments that we're seeing. On a similar note, all of the alternative identifier solutions: again, it depends on what the notice looks like. With UID 2.0, it would appear to be in everyone's interest if there's one standardized way to explain to people what they're opting into. But we all know that when you actually explain this to people and give them a real, meaningful choice, some people choose no.
So, in the process of hanging on desperately to scale, we're going to end up sacrificing the durability of those approaches. So, I'm very keen to see how these concepts play out.


Debbie Reynolds  13:53

Yeah, one thing that I'm really interested in, and in the US I've not seen anyone try to articulate or codify it anywhere in any type of law or regulation, is this problem where there are these legalese 80-page notices and policies that people don't read and don't have time to read. It's sort of, wink, wink, we're being transparent, but the things that people really want to know about are buried in 80 pages of legalese and tech jargon. So what do you think about creating that transparency through better explainability?


Arielle Garcia  14:32

Yeah, look, I think that that is a piece of the puzzle. I also think that the reality is there are just certain risks that people won't be able to comprehend. An example of that is how ad tech-related data is used to enable warrantless surveillance. I know the FTC gently suggested that that should be included in one's privacy policy. I think the bigger question here is, what are the substantive limits that need to accompany better explainability? For this all to work, I don't think that it's all one thing or another. I don't think that the notice and choice regime solves some of the harms that are just not easily perceived by basically anyone.


Debbie Reynolds  15:22

I agree with that; yeah, it's definitely tough. There are a lot of issues. So I'm happy to see people like you who are really chipping away at this mountain of issues, basically, and trying to make sense of it all. What is happening in the world right now, in tech, data, or privacy, that concerns you most?


Arielle Garcia  15:44

We touched on this a little bit with the Verilog discussion that we had. But obviously, post-Dobbs, there has been heightened emphasis on the collection and use of sensitive data in advertising, and obviously, the FTC has done a lot of work recently around sensitive location data. The Avast order was particularly interesting, as was the blog that was posted yesterday (I know it won't be yesterday by the time this airs), which said that browsing data is, per se, sensitive, full stop. So I'm very encouraged to see some progress being made on that front. When you talk about privacy, there's a tendency to start with the regulations; I always think about these data policy issues as being at the intersection of privacy, consumer protection, and competition. On a very human level, I'm particularly worried about the uses of data that result in the greatest potential harm, especially to the vulnerable. So, for me, sensitive data is absolutely one of those areas, but I am really encouraged to see some progress start to be made around it. I'm also excited to see what happens with Washington's My Health My Data Act; I think it's going to be fascinating to watch the industry respond. I know how this all works; I feel like it's going to take quite some time before anything meaningful happens. But at least having something to codify some type of protection around health data, and sensitive data more broadly, is encouraging to see.


Debbie Reynolds  17:29

Oh, I love your thoughts. You mentioned that. So I want to go there about the link between competition and privacy. I had a spirited debate many years ago with someone who's in antitrust, and they didn't think that it had anything to do with privacy. There was no linkage there. I just don't agree with that. Obviously, you don't either. So, what are your thoughts?


Arielle Garcia  17:51

There's no way to build a healthy market without getting those two right. Where I get frustrated, though, and I was just talking to someone else about this, is that two things can be true at the same time: I can think that Google is absolutely weaponizing privacy to build a future that aligns with their commercial interests, and I can also see that alternative identifiers are heading down the same path that got us here in the first place. Those two things can be true at the same time. The second point people raise then is, well, what's the solution? The interesting thing to think about and to project is that I would be shocked if any action is taken against alternative identifiers before action is taken to curtail Google's dominance. From an order of operations perspective, I would be surprised if alternative identifiers precede something being done about Google. But they're absolutely interconnected. They can't be pulled apart, and only if they're solved for together, intentionally and in coordination, do you get to a place where we have a healthy market where both businesses and consumers have meaningful autonomy and meaningful choice. Today, we're all beholden to the platforms.


Debbie Reynolds  19:12

What is happening in the US in terms of either regulation or activities that you're seeing, in addition to the My Health My Data law that you think is a good step forward, a good step ahead in privacy?


Arielle Garcia  19:26

I think that the work the FTC has been doing is the most impactful and the most fascinating to watch. One of the things I've been prioritizing in a lot of the writing and posting that I do is amplifying some of those orders and, again, the blogs and the insights that they're sharing, because, quite honestly, the industry narrative likes to drown out a lot of that stuff. It's actually really fascinating. I posted something yesterday about the business blog post that apparently was surprising; there were others in the industry who found it worth reposting like it was a big deal. They didn't recognize that this dialogue was happening. To me, it's obvious. I've been watching this since GoodRx, and then you saw X-Mode. I've been watching this happen. But when, at the same time, you have industry association conferences and so on all pretending everything's absolutely fine and dandy and nothing's going to change, you realize that the narrative is dominated by one view that is not necessarily representative of the reality here. So, I think that the FTC's work is great. I also appreciate, thinking back to the GoodRx order, the fact that the analysis went into Customer Match, sharing data via pixel, and the difference between data shared via standard event parameters and custom event parameters. That would surprise a lot of people in the industry who are not tracking this stuff because, again, the prevailing industry narrative is that regulators have no idea how any of this works. I think that's so clearly not true. The second thing I'm excited about is the first enforcement decisions that California has taken. If you take a step back to the beginning of this conversation, you asked how some of this changes: you make the reality unavoidable to marketers, and that seems to be what California is doing, which is wonderful.
Because yes, you can absolutely go after the roots of the problem; you can try to take action against the platforms that have all of the resources in the world and to whom a lot of this is a cost of doing business. You can go after dodgy intermediaries; you can do all of that. Or you can go after where the money starts, and that's another interesting path to force downstream accountability. So I think that's kind of fascinating, and I'm curious to see whether that's an intentional strategy that's going to continue or a happy accident, because with Sephora, for example, they doubled down on not providing an opt-out, you know?


Debbie Reynolds  22:10

Yeah, Sephora's interesting. People who had actually read the law would know that they should have been doing that stuff anyway. So when the Sephora settlement came out, the clutching of pearls and rending of garments was just crazy. Just get with the program, folks; you're not supposed to do this; you're supposed to give people choices; you're supposed to give them transparency, especially a big company like that, which should know better. The reason regulators go after those bigger companies is that they want to set an example of what you definitely should not be doing with people's data. Tell me a little bit about how you think this AI gold rush is changing privacy. Is it making it more visible? It's definitely making it more challenging. What do you think?


Arielle Garcia  22:58

Yeah, I agree with that. In the context of ad tech especially, I worry that AI provides a new cloak for things to hide behind. One of the things I'm watching play out, and here we also come full circle back to competition, is Google's Privacy Sandbox. Here's what I'm seeing: on the one hand, they have this Privacy Sandbox solution that is incredibly confusing and incredibly expensive for companies to integrate with, for what may well be pretty lackluster results. On the other hand, they have Performance Max, which is a "don't worry about it, we got this" product. It's going to be very difficult to see how they're benefiting from data from across their properties, and it's going to be very difficult to hold companies with these types of products accountable for discriminatory outcomes, for example. This has always been challenging, even when we can see the targeting parameters, let alone when it's "don't worry, we got this," and you multiply that by the algorithm type of approach and the typical refusal to accept accountability. So, for so many reasons, I worry about this. I also worry that, again, AI is the buzzword; it's a hot topic that everyone's excited about, and unfortunately, what I see happening with products like Performance Max and Advantage Plus is marketers being conditioned to relinquish greater control and even more transparency. That's just not a behavioral change that is going to yield good outcomes for basically anybody except the platforms.


Debbie Reynolds  24:43

Yeah, I was really dismayed by the way a lot of this cookie litigation went, in terms of people saying, hey, we need to get rid of cookies, when cookies are just a vehicle for stuff that people want to do that they shouldn't be doing. So I would much rather have people focus on third-party data sharing without someone's knowledge or consent than fixate on a cookie. Because if I compare a cookie to what's happening in the sandbox, or Google with Topics, I would much rather have a cookie than someone downloading all my browser history and then analyzing it, you know what I'm saying?


Arielle Garcia  25:29

I honestly think it's the same. The difference is that before, everyone could do it, and now only Google can do it. So yes, I guess, if we forget the market effects and the fact that it is what's empowering this market for alternative identifiers, we can put that aside for a second. But yes, maybe there are incremental privacy gains if you're comfortable with Google being the keeper of all of the data; there is no net change other than that this data is now solely in their possession. Then, like I said, you take that and pair it with the reality that, from a macro perspective in the market, you have alternative IDs and you have fingerprinting that's going to happen in the absence of it. So I agree with you; at the end of the day, I don't know that we're getting to a place that's better at all.


Debbie Reynolds  26:15

Yeah, very concerning. I wanted your thoughts about technology in the FemTech area. So especially post-Dobbs, there's been a lot of concern about these devices that are collecting data, for example, like ovulation devices, or just people tracking the types of apps that people have on their phone and making inferences there about those people. What are your thoughts about that in terms of privacy?


Arielle Garcia  26:41

For FemTech, specifically? My perspective on this gets back to the doctor's office example, where you said you actually like that they have this stuff; there is a legitimate reason for data to be collected when it's used to deliver value to a person, where that person wanted that value and understands that their data is used to provide it. That said, I think it's an opportunity for FemTech companies to go above and beyond, given the sensitivity of their core audience, to really make sure they're safeguarding that data. Hopefully, it's table stakes by now; I know, I'm not naive enough to think that it is. But let's assume that it's table stakes by now that we're not going to share that data for advertising purposes and we're not going to let random SDKs be integrated into our app. So, let's assume that all of that is handled; there's still the reality that if this data exists, it can be subpoenaed, and there are still risks that exist, even if it's used for legitimate purposes. One of the things I saw that was interesting, this was a couple of years back now, was one of these companies basically developed a failsafe tool where, I believe, you entered some code and it would wipe your data and present falsified data if your device was obtained. So there's the local storage element, and then, remotely, you can have that data be wiped and replaced with bunk data. I think innovation like that is interesting to see. The FemTech space is ripe for it.


Debbie Reynolds  28:10

So if it were the world, according to you, Arielle, and we did everything you said, what would be your wish for privacy anywhere in the world, whether that be human behavior, regulation, or technology? What are your thoughts?


Arielle Garcia  28:25

Yeah, I think the single greatest change would be to ban the surveillance advertising business model. There are a lot of downstream effects on human rights, civil rights, autonomy, civic integrity, and democracy, and at the heart of all of them is this business model that incentivizes extraction and the unchecked sale, use, and sharing (whatever word you want to use to describe the making available) of data to any party. At the same time, there also isn't regulation requiring transparency around the parties to which data is sold, or any type of know-your-customer diligence or anything like that. While you could imagine a regulatory regime that can mitigate most of the harm, you get back to the fact that marketers are not benefiting from this; they're just not. It looks like they are, it may feel like they are, but there's not a lot there to support it. That's why I say the most impactful change would be to ban the surveillance advertising business model. There are legitimate ways to use data in advertising: again, first-party use cases. If I have a legitimate relationship with a company or a publisher, where I receive actual value in exchange for interacting with them, and they use my data to serve me, there are legitimate ways for data to be used. But this third-party tracking-based business model is, at this point, harming marketers, harming customers, and harming publishers at the expense of our collective future.


Debbie Reynolds  30:03

I agree with that wholeheartedly. I think the farther that a company gets away from the benefit of a human, that's when you get into trouble. So if you can find a way to make what you're doing beneficial to the person, they may say, okay, I'll agree with that. I'll do that. But so much of what you're talking about doesn't benefit the person at all, and that's the problem.


Arielle Garcia  30:27

I agree.


Debbie Reynolds  30:28

Yeah, well, so great to have you on the show. Thank you so much for being on here. I'll be watching your LinkedIn and looking at your Op-Eds and stuff, and happy to see you and I love the work that you're doing. So thank you so much for that.


Arielle Garcia  30:41

Thank you so much.


Debbie Reynolds  30:43

Talk to you soon. Thank you.