"The Data Diva" Talks Privacy Podcast

The Data Diva E71 - Nishant Bhajaria and Debbie Reynolds

March 15, 2022 Season 2 Episode 71
"The Data Diva" Talks Privacy Podcast
The Data Diva E71 - Nishant Bhajaria and Debbie Reynolds
Show Notes Transcript

Debbie Reynolds, “The Data Diva,” talks to Nishant Bhajaria, Director of Privacy Engineering at Uber and author of the book Data Privacy: A Runbook for Engineers. We discuss his operational focus on Data Privacy, the large gap between operational and legal Data Privacy and the need for collaboration between them, the shift from precedent-based thinking to proactive policy, successful tech companies and collaboration, the challenge of data deletion, the data life cycle and a new emphasis on purpose, his greatest concerns in Data Privacy, trust and data quality, regulation versus understanding how technology works, focusing on potential harm rather than technology, Data Privacy focused technology, helping organizations ask the right questions about Data Privacy, great career advice for technology people who want to enter the Data Privacy field, and his hope for Data Privacy in the future.

"Nishant Bhajaria is the Director of Privacy Engineering at Uber; he has led teams of engineers, architects and data scientists to help operationalize privacy programs at Google, Netflix and Nike. He is also the author of the recently published book Data Privacy: a runbook for engineers and teaches courses on cybersecurity, career development and inclusive leadership on LinkedIn."

NOTE: Manning Publications has offered a generous discount to “The Data Diva” Talks Privacy Podcast listeners to purchase Nishant Bhajaria’s book called “Data Privacy: A Runbook for Engineers.” Please see the podcast transcript for details on how to take advantage of the discount to get the book.


35% Data Diva Discount from Manning Publications

“The Data Diva” Talks Privacy Podcast, in collaboration with Manning Publications, is thrilled to present a special offer to our podcast listeners. Listeners who want to purchase the book “Data Privacy: A Runbook for Engineers” by Nishant Bhajaria can use the link below with the following code for a 35% discount: poddatadiva22
Listeners can refer to this link for Nishant Bhajaria’s book: http://mng.bz/zQNQ

Also, listeners who want to purchase other books from Manning Publications can use The Data Diva’s permanent 35% discount code, good for all Manning products in all formats: poddatadiva22

You can refer to this link for all Manning Publications titles: http://mng.bz/GGJO

If you have any problems redeeming the coupon codes, please contact Ana Romac at anro@manning.com

50:25

SUMMARY KEYWORDS

privacy, data, company, people, engineers, legal, deletion, tools, customer, conversation, laws, engineering, cookies, big, product, understand, means, build, pretty, team

SPEAKERS

Debbie Reynolds, Nishant Bhajaria


Debbie Reynolds

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.



Debbie Reynolds  00:00

Hello, my name is Debbie Reynolds; they call me “The Data Diva,” and this is “The Data Diva” Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world and information that businesses need to know now. I have a very special guest on the show, Nishant Bhajaria. He is the head of privacy engineering for Uber. Nice to have you here. This is a very timely podcast because you have a book that just came out, called Data Privacy: A Runbook for Engineers. It is on Amazon, and I hear it's selling very well. It just got released. Congratulations.


Nishant Bhajaria  00:57

Thank you. People care about this topic. That's what that tells me.


Debbie Reynolds  01:01

Oh, absolutely. I was really excited. You and I were connected on LinkedIn already. You reached out, and I thought it would be a great thing for us to do this podcast together. To me, you're like the perfect guest. You're the person I like to talk to because I like to talk about the gaps in privacy. And I feel like you fill that gap, right? I think, especially when GDPR came out, so many people started focusing on privacy as more of a legal issue, right? And we know that there are legal and operational things that have to happen in privacy, and there wasn't as much attention on the operations. But I feel like you fit into that middle space where you're trying to connect the legal part of it to the operational part. So the fact that you're doing privacy engineering is fascinating to me. I also want to say, this is so cool: you and I have a friend in common, Olga Kislinska. She was at Nike, and I think you were at Nike with her as well. She's been on the podcast. I'm a great fan of hers. And I'm glad that she's found her way over to Uber with you.


Nishant Bhajaria  02:25

Yeah, she's part of my team. That's fantastic. We've known each other for like 10 years now.


Debbie Reynolds  02:29

Yeah, I love her. She's amazing. So tell me a little bit about your journey in privacy. Let's start there.


Nishant Bhajaria  02:38

Thank you so much for that. So yeah, my career began as an engineer back in the day. I have a Bachelor's and a Master's in Computer Science. And even during my college and grad school days, I was one of those eclectic engineers; I liked to code, but I also liked the policy world. I was writing columns for the college newspaper, I was on the debate team, so I was kind of all over the place, even back then. That's kind of how my career began. I was an engineer at Intel and WebMD. So if you used WebMD software back in the day that lets you search for a doctor, or the software a lot of companies bought that lets you search for medicines that will help you, a lot of that code was mine. That's kind of my claim to fame, and I can point to that as something I built that people can use. And then, somewhere after 2010, the world started shifting, and the concept of data transfers became very real. When I was at WebMD, customers often sent us lists of their employees who wanted a gift card, and they would get that gift card for completing, for example, a health care physical. So I would process all of those files at the back end. And I found increasingly, back then, that people didn't always know how to protect their data, right? I would get spreadsheets with information unencrypted, without password protection, over email. So I started asking questions like, hey, how should we protect this better? So even before Privacy and Security Engineering became disciplines, I was the guy asking questions which people didn't always know the answer to. And if nobody knows the answer, maybe I should create the answer myself. So that's how I got into privacy. It was a very informal introduction. And then, by 2012 or 2013, the discipline truly took off; at Nike, with its scale, it was kind of a great opportunity. I have great memories of teams from back then. And then, in 2016, I moved to the Bay Area to work for Netflix. And that was pretty much where the career trajectory just took off. A lot of my work now in teams at Uber, and before, has been basically building out organizations, either within my team or across the company, that connect legal to policy to data analysis and machine learning, across different geographies, to engineering. So my teams build tools, just like the book describes: how do you make sure that privacy is handled correctly in a central team so that the engineers across the company, whose job it is to build products that make the company money, can do that? So essentially, how do you make sure that privacy becomes a force multiplier and a differentiator without slowing down the company and without hurting engineering morale? That's kind of my sweet spot. That's how I got into the space.


Debbie Reynolds  04:53

Yeah, I love that story. And I would love to see more companies fill this gap in that way, because it's definitely a huge gap, actually. A lot of times, what I see some companies do, and this is not always the way, is start with the legal part, right? Let's get forms and policies and procedures in place, and then let's see how the technical part can adapt to that. But I feel like when you're working at a company whose product is around technology, you have a different focus than other companies. If you have a kind of data-driven product, you have to start with what you're actually doing and then try to build those policies around that. What are your thoughts?


Nishant Bhajaria  05:47

Absolutely. And you make a great point. And this is a very insightful question, even more than people often realize. So yes, historically, privacy has been thought of as a legal concept. Historically, security has been thought of as an incident response concept. The human brain is known to sort of find one thing that is like another, and we try to create patterns that way. But really, security is about encrypting data. It's about reducing the size of data, and it's about finding patterns, doing gap analysis, doing fraud analysis, things like that, right? Security is very proactive. Privacy is the same way; it is a lot more proactive than people realize. And I'm going to have you sort of imagine a visual here: think of a funnel with the narrow end on the left-hand side, and the data comes in on the narrow end. Okay, that's kind of the company's edge API layer. When you first open the Netflix website, when you open the Uber app, data passes through the funnel from left to right. And in my funnel, the size grows as you move from left to right; what actually happens is that the engineers across the company make copies of that data. They do creative things with that data, they share that data with third parties, they get data from other locations and merge it with that data. And as a result, when you go from left to right, the funnel grows. And what happens is, typically, in most companies I've seen, the legal team enters on the right-hand side of the funnel, and they have to make the very difficult decision on the right-hand side, where the volume of data is exceedingly large: they have to either approve something before it ships or block it. And nobody wants to be in that difficult situation. Because by that point, a lot of engineering hours have been sunk, a lot of money has been spent. And then if you tell engineers, oh, my goodness, you can't ship it, it leads to some very unpleasant conversations, right? So when my teams historically have worked with legal, what we typically do is we talk to engineers at the left-hand side of the funnel before the legal team enters the conversation. And we tell them, here's what you should and shouldn't do: maybe don't make a copy of this data, some other engineer already has a copy; don't share data with this vendor because their practices are not very solid; or throttle your API. So we address a lot of the risks at the outset. And we let the legal team know: hey, by the way, there is this design coming your way; we've addressed these three risks, but the fourth one is a legal one, and you've got to make sure that you take care of it. What that means is a lot of the suspense is taken out of the mix; we've taken a lot of the risk out of the process, in partnership with engineering, in partnership with the legal team, before the formal privacy impact assessment begins. This makes it exceedingly easier for the engineers to work with privacy and not see us as a blocker. It also means that the legal team does not have to do as many PIAs as they historically have done. You know, the thing is, if your engineering team grows 2x and 3x when your market share grows 2x and 3x, the legal team cannot automatically grow 2x and 3x; it just cannot happen, because the legal talent simply isn't available in the market right now. Plus, attorneys are expensive, just like I am.
So what my team does is we work with legal, and we work with engineers across the board, to make the process smoother, less bureaucratic, and less burdensome for the company. And what I have historically noticed is that the pace of approvals goes up, the number of ERDs that get blocked goes down, and the relationship between these different teams is much smoother as a process. So don't let anyone tell you that privacy slows you down; it can actually speed you up a lot more.
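To make the shift-left idea concrete, here is a minimal sketch of the kind of early design check Nishant describes, run at the narrow end of the funnel before a formal PIA; the rule set, field names, and vendor list are illustrative assumptions, not any company's actual tooling:

```python
# Hypothetical shift-left privacy check: surface risks in a proposed design
# before the formal privacy impact assessment (PIA) begins.

APPROVED_VENDORS = {"vendor-a", "vendor-b"}         # assumed vetted vendor list
EXISTING_COPIES = {"user_profiles", "trip_events"}  # datasets another team already copies

def review_design(design: dict) -> list[str]:
    """Return early-stage risks to resolve with engineering before legal review."""
    risks = []
    for dataset in design.get("new_copies", []):
        if dataset in EXISTING_COPIES:
            risks.append(f"don't copy '{dataset}': another team already has a copy")
    for vendor in design.get("third_party_shares", []):
        if vendor not in APPROVED_VENDORS:
            risks.append(f"vendor '{vendor}' is not vetted for data sharing")
    if design.get("api_rate_limit") is None:
        risks.append("throttle your API: no rate limit specified")
    return risks

# Example: three risks are caught at design time; only the residual
# legal question needs to reach the formal PIA.
print(review_design({
    "new_copies": ["user_profiles"],
    "third_party_shares": ["vendor-x"],
    "api_rate_limit": None,
}))
```

In this sketch, legal only sees the residual risks the early review could not resolve, which is the division of labor he outlines.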


Debbie Reynolds  08:50

Yeah. Yeah, exactly. I love what you're talking about. When you're in a data-driven company, they have to work that way; otherwise, you won't be able to put out products or put out updates with the speed you definitely need to. But I think there's also a shift that needs to happen, and I would love your thoughts on this; you touched on it a bit. Let's think about legal: a lot of things in legal are based on precedent, right? Laws, regulations. And a lot of times, you're working with things for which maybe there aren't even regulations yet, right? So you have to come up with your own framework for what you think would align with maybe even a future regulation. But then also, you know, you can't really do privacy well if you think of it as a reactive thing. So people in legal, if they were trained that way, right, treat this as kind of a reactive as opposed to a proactive thing. That can be a problem in the mindset. But then also, you know, it has to be baked in. It can't be thrown into the mix later. We know that doesn't work, especially when you're putting out products. What are your thoughts?


Nishant Bhajaria  10:10

Absolutely. And I feel like the legal team has pretty challenging tasks in front of them at every company, because you have this stream of projects that need to get cleared for release, but you also have a regulatory landscape that is somewhat uncertain, to put it politely, right? And I feel like it is easy to blame the attorneys. But really, if the tech sector and businesses had done the right thing, like three or four, five, six years ago, we would not be in this situation where you have uncertainty on the legal and regulatory side of the house, and people struggling to make up for it, right? So what I think needs to happen, and I've had a great deal of success with this in my own career, is that you work with the legal team, and you create an internal abstraction of what your privacy posture is going to be like. And not just privacy; I like to call it data protection, because privacy, security, trust, and physical safety are all pretty close to each other. So what we have historically done in my career is that my teams worked with the legal team and the engineering team to come up with an internal abstraction. How do we rank data based on risk? How do we create a workflow for privacy and security reviews? How do we create privacy controls and make them enforceable at scale across the organization at all stages of the data lifecycle? That's what is known as privacy architecture. That's part of my title in my current job right now. And when you build that infrastructure, your ability to react to legal and regulatory changes goes up significantly. Because, as you correctly mentioned, we have GDPR, we have CCPA, we are looking at a whole bunch of state laws across the country. And that's before you even look at South America and South Asia. And it's going to be very hard for companies to scale at this level. You know what's interesting, Debbie? When we had the discussion about GDPR as an industry three, four years ago, the conversation was, hey, GDPR will put the big tech companies in line, and GDPR is going to serve as a check. And my response to that is to check the stock price of some of these big tech companies when GDPR was passed and check their stock price today. Obviously, today's a bad day because the markets took a bit of a swing in the last couple of weeks. But in general, the stock prices for the big tech companies have gone up since GDPR. So the law is having an unintended outcome of basically hurting small companies without keeping in check some of the big ones, right? So what you really want to do is take the bull by the horns, come up with an internal abstraction to synthesize all of these different laws, and come up with an infrastructure that is very tool-heavy, that gives you visibility across the company in terms of what you have, what you collected, who uses it for what purpose, and whether that data was processed correctly, right? If you have that conversation, your odds of being GDPR and CCPA compliant go up a lot, without making the whole exercise seem very legal, very bureaucratic, very compliance-checkbox-driven, because you want to use compliance as a floor, not a ceiling. And the argument here is that if you do all those things correctly, you will improve the quality of your data, you will shrink the size of unneeded data, you will spend less on security, you will spend less on deletion, and you will spend less on cleaning up the mess after the fact. So I would use regulations as a floor, not a ceiling, in that conversation.
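As a rough illustration of that internal abstraction, here is a minimal sketch of ranking data by risk and attaching enforceable controls to tiers rather than to individual statutes; the tier names, categories, and controls are assumptions for illustration only:

```python
# Hypothetical internal data-protection abstraction: one risk scheme is
# synthesized from GDPR, CCPA, and other laws, and enforceable controls
# hang off risk tiers rather than off individual statutes.

RISK_TIERS = {
    "P0": {"examples": {"email", "home_address", "government_id"},
           "controls": ["encrypt at rest", "strict access control", "delete on request"]},
    "P1": {"examples": {"ip_address", "device_id"},
           "controls": ["pseudonymize", "short retention"]},
    "P2": {"examples": {"app_version", "locale"},
           "controls": ["standard retention"]},
}

def controls_for(field_type: str) -> list[str]:
    """Look up the controls a field must carry, wherever in the company it lives."""
    for tier in RISK_TIERS.values():
        if field_type in tier["examples"]:
            return tier["controls"]
    return ["classify before use"]  # unclassified data is held back by default

print(controls_for("email"))
# ['encrypt at rest', 'strict access control', 'delete on request']
```

The point of the abstraction is that when a new law arrives, you map it onto the tiers once, instead of re-reviewing every dataset in the company.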


Debbie Reynolds  13:03

Yeah, that's fascinating. I agree with that. So tell me a little bit about collaboration. Companies that are very successful in technology, in my view, know how to do collaboration, right? In traditional business, everyone is in silos, and they only work with each other when they have to, right? But I think privacy necessitates breaking down those silos and being able to communicate across them. So tell me a little bit about your executive experience in breaking down those walls.


Nishant Bhajaria  13:45

And again, that's such an insightful question, Debbie. So historically, you know, when I was an engineer back in the day, 15 years ago, the silo concept didn't exist; everything was very interconnected. I had to get my instructions from the top down. My manager's OKRs became my OKRs; his manager's OKRs became his OKRs. So it was all very, very sequential; it was all very, very top-down. And that made life very simple; it could also make life very predictable. But it was very hard to be innovative, it was very hard to get a product out the door quickly in those days, when you had this six-month waterfall cycle where product management would come up with requirements, hand them off to engineering, and then they would churn stuff out six months later, when in fact the whole market had changed. So in order to address that, you came up with a more agile, siloed function, where every team was given the autonomy to ship stuff based on their market and their innovations, right? And you had this ability where teams could have their own tech stack, and they could have their own deployment pipeline. And that made engineering much simpler, and to be very honest, that enabled companies to churn out unique products that were to the benefit of the customers as well. The downside of that approach is that now, when companies have to do something truly cross-functional and interdependent, that muscle has atrophied, because everybody was in it to get their own products out the door. So building out the abstraction, like I mentioned before, and having the privacy engineering team build central tools that apply across the platform without slowing down these engineers, is the right way to do business, simply because it enables you to keep the silos that are truly productive but builds this connective tissue where people actually start understanding each other a ton more. You know, one of the earliest lessons I learned as a privacy engineer is that a lot of engineers wanted to delete data that they didn't need. But they were also in such a rush to get stuff out the door that they always assumed that somebody else would take care of the dirty work, right? The assumption was that somebody else was watching the store behind the scenes. And what we have learned in the last two years with the COVID situation, for example, and I tell this to new engineers all the time, is that there is no smart person sitting behind the scenes to take care of you; you have to take this responsibility to make sure that your tools are compatible, and it is the privacy engineering team's job to make sure that that compatibility exists. So my job and my team's job is part technical, to build these tools. But the other part is sort of working like the teams' Chief Privacy Officers across the board, to make sure that they can work with each other in a very synchronous fashion, without assuming that somebody else is to blame or somebody else will save the day.


Debbie Reynolds  16:03

Yeah, and let us talk about data deletion and the data life cycle. There's a lot of fanfare, right, a lot of attention, whenever there's a new product or a new rollout of something. But at the end of that, let's say a tool or technique has kind of aged out; the data isn't as valuable to the company as it once was. And a lot of companies trip up here because they retain data for so long, and then maybe it's no one's, quote, unquote, job to get rid of that data. And now, with privacy regulations, we're seeing a lot more impetus around tying data to purpose. So once data reaches the end of this life cycle, right, these privacy regulations are saying you should not keep data longer than necessary and should get rid of it. And that's not as easy as saying, you know, delete it every seven years, right? So talk to me about that back-end thing no one wants to talk about.


Nishant Bhajaria  17:16

Yes, deletion is one of those fantastic concepts that's very hard to do. You would think, from a data destruction perspective, you locate the data and you delete it, right? It sounds very simple. And like most things that sound too simple to be true, this one definitely is not that simple. So what I would say is, first up, let's get our definitions right. People often use words like deletion, compliance, security, and they don't always understand what they mean. Deletion has several layers to it. You could destroy data completely, as if you never had it to begin with, and that is very difficult to do. Because once data enters the system, there are copies of that data; there is information we infer about users based on data that was then deleted, but that derived information still remains in the system, right? So deletion is just very hard to do, because deleting data is like finding all the breadcrumbs. And as anybody who's gotten lost in the jungle will tell you, those breadcrumbs are hard to find. That's number one. So having a definition for what deletion means is pretty critical. And there, again, I go back to the abstraction that I talked about: with the legal team and the engineering team, let's define what data looks like from a deletion perspective. The second thing I would say is that having data governance in place at the early stage of the funnel, on the left-hand side, the "shift left" terminology people often use in the industry, is pretty critical. Because what you can do at that level is intercept data while the quantity is relatively smaller, and you can tag that data. So, for example, if you intercept my email address, say majora@gmail.com, you can recognize, based on the pattern of the email address, that this is an email; it might be sensitive. So you can tag it by saying this is, for example, P0 (Priority Zero) level data, slash, email. Now, any engineer that uses that field down the road will know that this field is something that's very sensitive. And you know why that's important? Because a lot of the data that companies collect is unstructured data, so it does not come in with a traditional label on it. The reason having that tag is beneficial, Debbie, is that when you need to delete that data afterward, you can locate specifically that aspect of the data set. So you could have email, home address, IP address; you could have, you know, first name, last name. And if all you need to delete is the email, you can execute your algorithm in a very targeted fashion by looking at that entire blob of data and deleting just the email. So intercepting that data early and coming up with that tag makes subsequent deletion much easier. Because the reason a lot of deletion algorithms fail is that companies try to delete a whole ton of data in one go. And that's very hard to do, because deleting data takes compute power; it takes memory, it takes algorithms, it takes queries. That stuff is very hard to do because it slows down the engineers. Because remember, what you're doing in that moment is not just deleting data; you are competing for bandwidth with other engineers across the company who need to use the same data or other parts of that data. And that is how privacy and the rest of the engineering organization often get in trouble: they look at each other as adversaries.
The third thing I'll mention is that deletion has other options, too; you don't always have to delete data completely. You could disconnect sensitive data from other non-sensitive data, which means you can use that sensitive data in a very targeted fashion by managing access to it, while the rest of the data can be available across the company on a more permissive basis. You could anonymize that data, you could obfuscate that data somehow, you could aggregate that data; so there is a lot to it. So having that partnership between my team, the engineering teams, and the legal team is pretty critical, because that is the only way you can execute deletion in a meaningful, trackable, defensible fashion.
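Here is a minimal sketch of the intercept-and-tag pattern he walks through, tagging an email field at ingestion so a later deletion job can target just that field; the regex, tag names, and record shape are illustrative assumptions:

```python
import re

# Hypothetical shift-left tagging: classify fields as data enters the system,
# so a later deletion job can target one field instead of scanning whole blobs.

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def tag_record(record: dict) -> dict:
    """Attach a sensitivity tag to each field based on a simple pattern match."""
    tags = {}
    for field, value in record.items():
        if isinstance(value, str) and EMAIL_RE.match(value):
            tags[field] = "P0/email"   # Priority Zero: highly sensitive
        else:
            tags[field] = "untagged"
    return tags

def delete_emails(record: dict, tags: dict) -> dict:
    """Targeted deletion: drop only the fields tagged P0/email."""
    return {f: v for f, v in record.items() if tags[f] != "P0/email"}

record = {"contact": "majora@gmail.com", "city": "Portland"}
tags = tag_record(record)
print(delete_emails(record, tags))   # {'city': 'Portland'}
```

Because the tag travels with the field, the deletion job never has to scan or re-classify the whole blob, which is exactly the cost he says sinks bulk deletion runs.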


Debbie Reynolds  20:34

Excellent. So tell me what's happening in privacy anywhere, whether it's regulation or tech, that concerns you most right now. What's on the horizon of concern to you?


Nishant Bhajaria  20:46

So I wouldn't say I'm concerned; I'm intrigued, because I don't always know how things are going to play out. And when you have a role like mine, and when you have a profile like mine in the industry, you kind of have to think ahead a little bit; you can't be very reactive, and you can't always come up with these plans at short notice either. So there are two dynamics that are pretty critical as far as I'm concerned. There's a lot of conversation in state legislatures, and at the federal level as well, around what privacy regulation should look like, right? And this is where I'm just a smidge concerned, to use your phrase, because we are living in a world where Steve Bannon on one side and Bernie Sanders on the other side don't agree on a whole lot, but they agree that the tech industry and privacy need attention, right? And when you unite the political spectrum against you, that's a bit of a problem. I remember all those CEOs of big tobacco in the late 90s, in front of Congress, basically arguing that tobacco is not addictive. I don't think big tech wants to be in that situation. So we have to make sure that we get our house in order before somebody who doesn't always understand what we do has the power to regulate us, right? Either we do the right thing now, or somebody who doesn't like us will make us do the right thing in a way that doesn't really work out very well for us or the customers. That is something that is of concern to me. And I feel like, just as we've been talking for 30 minutes now about the legal teams and the engineering teams working together, it is critical that industry and government work together to shape the conversation around privacy. And rather than just the policy or government relations teams having that conversation, it would be awesome if the government representatives who talk about privacy talked to engineers as well, because historically, engineers have been left out of the conversation. That's part of why I wrote the book; engineers need to have a seat at the table. So this book is going to serve as a bridge, hopefully, and I've gotten a ton of emails in the last three or four days, and that makes me feel pretty hopeful. The second dynamic that makes me feel even more intrigued is the regulations that are coming not from governments but from companies. When you have Apple basically saying that you need direct consent from users before you collect their IDFA or ad ID, that is pretty intriguing, because you actually have a company that controls a significant bridge between the audience that buys their iPhones and the ad publishers and providers. That is pretty interesting. Because when you have that consent layer in the middle, we don't always know how that's going to work out. That might lead to more consent, because people now are giving you informed consent, and you will have better engagement and better quality data. Or you might have some combination of people being reluctant and not giving you consent, and that will slow down the ad industry. So I'm not sure how that's going to fully play out, because all of these conversations are also happening in the middle of this COVID pandemic, right? What's going to happen when people aren't as dependent upon working from home? What's going to happen to the ad industry, the remote learning industry, the healthcare industry?
All these industries collect a lot of this data, and then the world changes post-COVID. So I'm intrigued about both of these dynamics, because having one of these happen is a pretty big deal; having both of these happen, even more so. And having both of these happen while we are in the middle of a pandemic is something unlike anything we've seen before. So that's kind of what I'm looking at pretty closely.


Debbie Reynolds  23:44

Excellent. So let's talk about trust. As you alluded to earlier, some people think about privacy almost as a tax, right? Oh my god, I was doing these cool things, and now I have to think about privacy, when it really can be an advantage. So the advantage I want to talk about, with trust, is data quality, right? Not all data is good data. You know, people say data is the new gold; I don't know, I think insight is the new gold, right? If you get good data, you can get great insights. And part of that data quality goes to trust. If individuals trust the organization, they will give it better data, more accurate data, right? Because they feel that the data they're giving benefits them. So talk to me about the advantage of trust in that regard.


Nishant Bhajaria  24:46

And that's a great question, Debbie. So in my early days at Intel, my boss used to tell me, don't confuse effort for outcome. You might have spent 20 hours doing this, but is it producing outcomes that are truly useful to the customer, right? It's the same point with data and insight. People often confuse data and insight, and they're two fundamentally different things. Sometimes good data gets you good insight. Sometimes going through a ton of data gives you insights that were buried. But if you end up with data as your only guidepost, you are basically setting yourself up for, at the least, inefficiency and, at worst, failure. So what I tell people when I partner with them is that there is a vested interest in making sure that data quality is good. All these people on the right-hand side of the funnel, the machine learning scientists, the data analysts, the data scientists, care about one thing: business insights. Where should we invest more money? What's working and what's not working? Did we do the right thing by going into this market? Or did we do the right thing by buying this company? The answer to those questions lives in the data. And even that data is changing every single day. So remember, the data is not this constant, where once you collect it, it's done, and you can query it until the end of time. So what we need to do is make sure that the quality of the data is great. When you have the categorization happen at the left-hand side of the funnel, that means that by the time you get to the right-hand side, the funnel has shrunk a little bit. But that doesn't mean that you have less useful data. It just means that, in all likelihood, you have the quality data you need, and all the data that you didn't need has been taken care of and disposed of. What that means is these engineers and data scientists have to query the data a lot less often; in the past, they had to run queries again and again because some of those queries failed, and some of those queries produced junk outcomes. When you end up on the right-hand side with a more manageable data set, it basically says a couple of things. It means we as a company are becoming more efficient, smarter, and better. It also means that the customers are reassured that we're not just holding on to their data because we can. We can, but we shouldn't. We're basically being very judicious. We're looking at: what is the impact of this data collection going to be? How will the customer feel if they actually found out that we had this data? And can we truly make the claim that the use of this data is producing benefits for the customer? And I feel like all of that makes it possible to really show the customer that there is trust in place. If you remember the 2018 testimony, when Sundar Pichai, the Google CEO, went before Capitol Hill, they asked him a ton of questions about dashboards, like how do you tell customers what you have about them, right? And when a company pushes out a product that lets the customer make choices about privacy, that is, collect this but don't collect this, track me here but don't track me there, that is a sign of trust. That means you can directly tell the customer, hey, we are giving you choice and control over what we do with your data. And what that basically means is that you as a company have enough confidence in your back-end tools that you can now go before the customer and build that bond of trust.
And this is why I said earlier: use compliance as a floor, not a ceiling. I don't think there is any compliance law in the world that makes you build these dashboards. This is the opportunity for companies to build out these dashboards and these tools for the customers, to build that relationship with the customer directly. Because you have two obligations here. One is to follow the compliance mantra and make sure that the legal and audit arm of the company feels secure. But you also should talk to the customer directly and say, hey, we care about you so much that we're going above and beyond the pure compliance benchmark, and we're shooting for trust. Because trust is, at the end of the day, everything. Unless people trust you, every other conversation we're having is bunk, because trust is literally the only thing that matters at the end of the day.


Debbie Reynolds  28:14

Yeah, yeah. Well, one thing that concerns me a lot: I look very closely at court rulings and at conversations about privacy across the political spectrum. And what we're seeing around the world is that governments want to do something, they want to do some type of regulation, and they don't necessarily know what to do. And then what's happening is that they're coming up with things that, to me, are untethered from the technological reality. So that concerns me a lot. And one good example that's going on right now is all this conversation around cookies, which I think is insanity. But tell me your thoughts about that. I'm concerned in that area.


Nishant Bhajaria  29:01

Yeah, and I share your concern, Debbie. If you look at the sort of people who run our country, just from a US perspective, if you look at the Presidency, if you look at the leadership in the House and the Senate, a lot of these folks, to their credit, came of age in the 60s. This was when we did big things: the interstate highways, the moon landing. They like the idea of scale, and they like the idea of passing huge laws that cover as many use cases as possible. And my friends in the regulatory space tell me that it's very hard to push legislation through, which means you often have only one bite at the apple. So the goal is to get as much covered as possible. The challenge is, somewhere past the 1990s, the world changed. Size isn't everything. We are looking at small things that make a big difference. Malcolm Gladwell wrote a book called The Tipping Point, about how small things eventually lead to big outcomes, right? Smaller computer chips, smaller product releases, smaller cars, smaller homes. So we're now trying to do small things in a much more efficient way, simply because getting these small things out quickly means you can get the feedback loop going with the customer a lot faster. Smaller micro teams across the company will ship out products a lot more efficiently without creating a lot of bureaucracy. So the big disconnect here is this expectation, on the legal and regulatory side, not just in industry but in the country as a whole, that you can do these really big laws and that will produce trust for the customers. My approach is, why not look at this in smaller pieces? Why not look at data collection as its own arm? Why not look at deletion as its own arm? Why not look at data sharing as its own arm? Like this whole movement that started in 2018 with Cambridge Analytica, which was about a small company that was able to extract a whole ton of data and fundamentally change the conversation about customer protection, right? So my concern around cookies exists. My concern, especially around third-party cookies and around consent, is definitely right there. But my bigger concern is the fact that people seem to have this expectation that there's going to be this huge privacy law that's going to fix all these issues. And to them, I say: look at immigration law, look at tax law. Is that the model you want to replicate? I don't think so. Because I've been in these companies, I've been in this industry for about 15 years now, and I know how stuff works behind the scenes; a lot of these changes happen when small teams do things that have an impact across the board. And that either works very well, or it works very poorly. So it is pretty critical that, just as the engineers who work in jobs like mine understand the law and understand the policy, the policymakers understand how engineering actually works behind the scenes, so that they can construct laws that are (a) enforceable, (b) understandable, and (c) verifiable. If you can't do those three things, you can have all the laws you want, but the real value, that is, customer protection and trust, will not actually be accomplished.


Debbie Reynolds  31:36

Yeah, we don't have cookie laws in the US, so to speak, in the way that they have them in Europe, but it always disturbed me that people caught on to that concept, and there's active litigation over this stuff. Partially because I think the way to keep laws more modern is to talk about the potential harm to the individual and not try to focus on the technology, because we know that the things people are upset about around cookies can be done in many other ways. So taking down a cookie doesn't really solve the problem. What are your thoughts?


Nishant Bhajaria  32:18

No, absolutely. And you made a fundamental point about not fixating on the technology, right? Let me give you a specific example. For all the attention that is being paid to cookies, what people often forget is that if I have some data about you in one data store here, and some other data about you in some other data store over there, and if I Google your name and some information about you, I can find out information about you online. Now, each of these three individual pieces of data may not be very dispositive, right? But if I combine the three of them together, now I have something. I can find out where you live, where you do most of your movie streaming, where you do most of your shopping. It could entirely be that you do most of your shopping at work, when you have like a 10-minute break. When you stream Netflix, it might be at home, because that's where you have enough time and focused attention to watch a movie. And if I can identify movement patterns between both of those IP locations, I can understand where you work and where you go for lunch or whom you meet with at a hotel. I can track you without ever knowing who you are. So for all the attention that is paid to cookies, it is really infrastructure, data, and cookies collectively that I think should be the focus: who accesses this information? For what purpose? What happens to the data? Where is it logged? How often is that log cleaned? It's a much broader conversation. And I feel like cookies have become what big data used to be five years ago, or cloud computing used to be eight years ago, when those concepts were brand new. And I feel like the danger here is that people are going to fall for recency bias, right? Essentially, cookies are the big thing we talk about, and by the time the law is passed, by the time the regulations are understood, we've moved on as an industry to something totally different, and the customer doesn't get protected. And when I talk about the customer repeatedly, I'm thinking about people like my dad; my dad uses his smartphone for two purposes. One is he sends me news on WhatsApp about things he thinks I don't know about. And I do know about them, but parents care, right? So they send you information every single day. And then he uses the Uber app to order his Uber to go meet his friends at the club. That's all he really understands. And I'm thinking about how we protect those customers. He doesn't know what cookies mean; unless there's a really nice cookie on his table, and he will eat it, that's all he understands. So how do we protect those people? The conversation about cookies has been basically about tracking this very abstract form of data, but not about how we make sure that the customer whose cookie it is gets (a) the protection and (b) the visibility they need. And that's, for me, the big topic of concern here.


Debbie Reynolds  34:35

I love it. I love that you're talking about it in the same way as I do. Oh my goodness, it just frustrates me so, so much when I see this, for sure. So tell me about privacy-focused technology now. We're seeing a lot more technology coming into the marketplace that is trying to help with the heavy lifting that companies do with privacy. What is your thought about, not necessarily privacy tech, but the sort of privacy sensibilities that you're seeing coming through in products?


Nishant Bhajaria  35:14

That is a fascinating area to get into, and we could do a whole conversation on that topic alone, Debbie. But yeah, you hit the nail on the head. There's a lot of money thrown into this industry right now. I get a lot of phone calls with a whole host of options. Some people want to sell me their tools; some people want me on their payroll. It's crazy. This discipline was so esoteric and so different like eight years ago; I remember being a lonely privacy engineer back then. To start there and end up in a situation where companies are valued at a billion dollars is pretty crazy. So I think the quick upshot here is that there are a lot of tools in the market, but people don't always understand whom they're building for, and people don't always understand whom they're buying from. That's the disconnect here. Because you can build tools that are very, very focused on the engineering side of the house. These tools are optimized for data discovery, for data deletion, for data categorization and inventory, those sorts of things. These are tools that are aimed at engineers, and they're aimed at people like me or my boss. And these tools have a lot of potential, but they also have a lot of challenges. Onboarding these tools takes a lot of time; remember, these tools will work to the extent that they're adopted across the company by the engineers who collect data and ship stuff out the door, right? And these engineers are incentivized to get stuff out the door and grow engagement; that's how they get promoted, that's how the companies make money. So oftentimes, these tools find it hard to get traction, because engineers at the company are really busy. So it's pretty critical, if you have any VCs, any startup founders listening to this podcast, which I hope you do: make your tools as easy to onboard as possible, because your audience, your customer, is not really me. I'm the one that signs the checks, but really more important are the engineers two or three or four levels below me, because they will be the ones who make the decisions around how quickly they adopt you. Really, the pyramid of power has been inverted in this case. What often happens is these VCs and these founders send me all these emails, they send me bottles of wine, but really, you should be wining and dining the engineers who have less power than me, because they'll be the ones who decide how far your tool gets. So that's one thing. The second thing I'll mention is that there's another set of tools that is aimed at the legal side of the house, the Chief Privacy Officers, the Office of General Counsel, and these stakeholders care about compliance. They care about legal audits; they care about the actual mapping of privacy controls within the company to articles of the GDPR, to articles in the CCPA, things like that. And what often happens is when companies buy these tools, they tend to buy them in a moment of crisis; they've either received a consent decree, or they've gotten a bad story in the press. And when you make decisions in a panicked state of mind, you don't make the right decisions. It's why I don't go to Costco when I'm hungry; I'll end up buying stuff that's not good for me. Anyway, what happens to companies is they end up going to these third-party vendors when they are upset, when they are stressed, and they end up buying the wrong thing.
So companies that need something like, say, a OneTrust, a compliance-focused tool, end up buying BigID, which is a very engineering-focused tool, and companies that need, you know, BigID end up buying OneTrust. Essentially, you end up buying a legal tool when you need the engineering tool, and vice versa. So my advice to the privacy tech community, for the vendors who build these tools: explain what you're solving for, explain who will use your stuff within the company. And I would dare you to go to the websites of many of these vendors; they don't do a very good job of explaining exactly what they're solving for and whom they're selling to. At the same time, a lot of companies don't always put in the due diligence, research, and time; the time for companies to do the research about these products is when things are actually going well. President Kennedy had a saying that the time to repair the roof is when the sun is shining; the time to understand the privacy tool market is before something bad happens within the company. So there is some work that needs to happen on both of these sides to make sure that the right purchasing decision gets made. Because what you do not want to have happen in these cases is that companies buy a good tool and end up feeling like the problem didn't get solved. That creates a lot of frustration on both sides, and that is what I'm looking to avoid here.


Debbie Reynolds  39:04

I agree with that. And I know, when I talk to, you know, money folks, VC and private equity folks that are looking to get into the space, some of them get upset with me, because they'll tell me, well, compare these two things, and I can't compare them because they're totally different, not the same at all. They hear privacy, and they put everything in one bucket, and it really doesn't help. And then another thing that I'm seeing is that companies have challenges asking the right questions, so they don't know the right tool that they need. And so they implement something that is not what they needed, and then they're back out in the market looking for something else, or maybe they'll have two or three tools now, you know. So it's definitely a problem, because I think you're right; it's hard for a regular consumer organization that isn't really steeped in this space to even differentiate these tools from one another.


Nishant Bhajaria  40:04

Yeah. And you made such a great point, Debbie. A lot of people are afraid to ask these questions because they have this fear of being found out. Everybody feels like, if I ask this question, I'm going to get the wrong kind of attention. And my response to that is that you are not alone; a lot of people are in the same boat as you. But the second thing is, not asking the question and thinking that things will work out is a bit like standing on the railway tracks, closing your eyes, and pretending that the train won't hit you; the train will still come at you. A lot of companies think they're like Tom Hanks in Cast Away, alone on the island, nobody helping them out. And my response to them is, you guys are like Tom Hanks, but not in Cast Away; you're like Tom Hanks in The Terminal, the movie where the guy is in the airport and has nowhere to go because his country has been taken over, right? So my response to them is to ask these questions. But the first thing you need to do is, don't worry about asking these questions outside; get your attorneys and your engineers in a room together, understand how data flows within your company, understand where your data lives. I have a diagram in the first chapter of my book that talks about how data flows in a typical company: how it enters the ecosystem, how it gets copied across the board, how the data size grows, how it ends up on engineers' laptops. That diagram is not just meant for one or two companies; it's meant for many companies. It will give you a sense that you are not alone and really make you realize that everybody is in the same boat as you. And that is why people like me exist, because this problem was not built yesterday; it was 10 years in the making, and you're not going to solve it in 10 minutes. So ask these questions, have these honest conversations. And really, if you have these conversations, you might end up building a community of folks that have an interest in solving this problem together. And you will also realize that people in government don't have all the facts either. So rule number one in a problem is acceptance; you have to accept the fact that things are bad, and that's when things will start getting better. But this fear of failure is not helping anyone right now. So I'm here mainly to tell these companies that, hey, you are not alone. Speak up, and you will find out that there are a lot of allies who need as much help as you do.


Debbie Reynolds  41:59

I would love for you to give us a bit of executive advice. Even when we chatted previously, I was excited to chat with you; I knew we were going to have a lot of fun, which we definitely have had on this call. But what advice would you give, or what has helped you rise in your career? A lot of times, when people think about privacy, they're like, well, I can't rise within my company in privacy unless I'm a lawyer, right? But we know that there are many areas in privacy which aren't necessarily within legal. So what has helped you ascend to all these great opportunities in your career? And what advice would you give to someone who is like us, right, technical folks who have taken an interest in privacy?


Nishant Bhajaria  43:00

Definitely. So people always ask me this question: okay, what happens to my career? Now, I'll tell you something interesting. Privacy engineering forces you to do something that very few people in your company are able to do. It forces you to learn how data works end to end; it forces you to understand how your infrastructure is actually laid out; it forces you to understand whom your company is partnering with from an API and vendor perspective, like, who are you sharing data with? Is it an ad network that sells ads? Is it a government regulator that wants to check your company for one reason or another? Your ability to understand the span of your company from left to right will be unparalleled when you go into privacy; there is just no way to excel in privacy without that knowledge. I would basically say, if you have engineers in the company who are shipping products, they have their moment of glory once every six months, nine months, or twelve months, when you have a major release. But in privacy, every single day, you could be addressing a risk that could cost the company a ton of money. You could have an important say in shaping internal laws and regulations within the company. You could have an important say in which vendors your company partners with. You will have a say in, you know, which companies your employer buys or sells. Your ability to have a cross-functional impact will be unlike any other role in the company. That's number one. How many other engineers can work with attorneys, policy experts, business leaders, and market owners across the company? It's privacy engineering. The reason I was able to get to my role and my level across the board is because, from my early days, I learned how to talk to people who are different from me: people on the sales side, people on the data management side, people on the legal side. I remember, in one of my earlier career jobs, I used to work for a company where we sold GPS systems for farmers. Basically, these were farmers in the Midwest who have to spray their fields with fertilizer, with seeds, and things like that. And this was in 2005, 2006, when gas prices were at four bucks for the first time, and I went on one of these sales tours with people in my company. At the time, a lot of the engineers looked at the sales team like these folks were the enemy; we tried to build tools, and the sales force constantly asked for features that were not tenable. And that trip I took made me realize for the first time how hard it is to sell stuff. It also made me realize how important it was for my tools to work efficiently, because these farmers didn't make too much money; their margins were pretty low. And for them to be able to use my tool correctly, it was important for me to directly understand their concerns. So where all of this goes is, when you work on privacy engineering, you will put together the point of view of the customer, the regulator, the government representative, the CEO at your company; you will understand all of their points of view and be able to build a program that benefits them all. That will give you a level of visibility and self-confidence that is very hard for engineers to get; you can typically speak to the board of directors as well, right? And that will make you so marketable that I think it's impossible to compare this to any other role in the company.
And more broadly, all these conversations we're having about trust, platform security, fairness, equity, platform integrity, fake news, all of these conversations will require the same level of cross-functional visibility, cross-functional partnership, and data-driven collaboration that privacy makes possible. So maybe you get into privacy today. But five years from now, you could be working on the trust side, you could work on misinformation, you could work on overall platform security, you could work on physical security, you could work on making your platform safe to the point where somebody can use it to get social justice for their cause, right. So if you look at the image behind me, there's an image of elephants, and I work with a lot of rescue organizations, we raise money to protect elephants from poaching; all of these possibilities have become real, partly because of the work I do on privacy, which in turn gives me credibility and financial footing to help these causes that I care so deeply about. So don't reduce privacy to a purely legal concept. It's an important concept. It's important for legal to be part of it. But engineering has a big role to play as well. And I promise you, if you handle this role correctly, it will open up possibilities that today you may not even know exist.


Debbie Reynolds  46:52

Absolutely right. Because these things are being created; I'm sure your title didn't exist 15 or 20 years ago. So don't look for a job, just create one, right? That's the best advice ever. Thank you. So if it were the world according to Nishant and we did everything that you said, what would be your wish for privacy anywhere in the world, whether it be technology, regulation, human stuff? What are your thoughts?


Nishant Bhajaria  47:19

I have two thoughts, and I have two thoughts on everything. So when it comes to privacy, the first easy thought is to make it a product. If you think of privacy as a product, as a feature within your company, you can communicate with today's customers a lot more directly and a lot more confidently. The risk, the fear that hangs over your head, of can I ship this in time, or what happens when the next law passes, will go down significantly if you think of privacy as a product, something that your engineers across the company can use. So that's goal number one. But the second goal is, let's have a societal conversation about what it means to build a platform that focuses on trust and customer relationships, right? When you have a tech industry that creates so much wealth but doesn't create as much work, it breeds distrust; it breeds animosity. And the reason the tech industry is facing a backlash right now is because we have done very well throughout the pandemic. If you look at stock prices, again ignoring the last couple of weeks, and if you look at profit margins, we've done very well, just as people across the world have suffered enormously. And I think this gap, this distance, is not good. When the lines of identity get blurred because of the disruption of the tech industry, it again breeds distrust. So my hope is that privacy can serve as a bridge between those of us who produce tech products and the customers who use them. If we can bring each other a little closer, if we can build these bonds of communication a little more continuously, that would be fantastic. And I feel like the book I've written, and the certification on privacy engineering that I'll be producing over the coming months, are aimed at creating this new breed of privacy engineers who are not just engineers; they're advocates, they're attorneys, they're leaders, and they can represent their cause not just within the company but outside the company, and create a discipline that takes this conversation to a whole other level. That's kind of what I'm talking about. And when we get there, I'm saying when, not if, we will be able to build platforms that will do a much better job addressing fake news, misinformation, physical risk, all the other problems we're talking about. This is an incredible opportunity, and I hope my fellow engineers don't miss out on it.


Debbie Reynolds  49:16

Wow, this has been a blockbuster podcast; you really dropped a lot of knowledge on us today. And I'm really excited that your book is doing well. I hope a lot of people go out and buy it. This is fascinating. I'm sure we'll be chatting more in the future. I'm happy that you were able to be on the podcast; it's been great having you here. Thank you so much.


Nishant Bhajaria  49:39

Thank you for having me here. Thank you so much.