"The Data Diva" Talks Privacy Podcast

The Data Diva E184 - Sharon Bauer and Debbie Reynolds

Season 4 Episode 184

Debbie Reynolds “The Data Diva” talks to Sharon Bauer, Founder of Bamboo Data Consulting in Canada. We discuss the complexities of the privacy landscape in Canada, including the outdated federal privacy legislation PIPEDA and the challenges posed by new technology. We emphasize the importance of staying informed and proactively addressing potential legislative developments while acknowledging the nuances and complexities of advising clients in an evolving legal landscape. We also discuss the evolving landscape of privacy in the digital age, highlighting the disconnect between privacy professionals' perspectives and consumer behavior. Sharon emphasizes the critical role of trust in driving consumer action, loyalty, and data collection, stressing the need for companies to prioritize building trust with consumers. We explore the challenges companies face in comprehending and adhering to privacy regulations, including the lack of education and transparency, particularly among medium-sized businesses. We also discuss the multifaceted issues surrounding privacy and data protection, including the implications of data misuse, the need for informed consent, and the long-term consequences of data disclosure. We express frustration with the limitations of automated privacy assessment tools and emphasize the need for tools that consider businesses' diverse operational and ethical contexts. Sharon shares her frustration with the operationalization of privacy and stresses the importance of humanizing the process. We also discuss the importance of using real-life examples to educate companies about privacy missteps, and Sharon shares her hopes for Data Privacy in the future.

47:38

SUMMARY KEYWORDS

privacy, companies, data, privacy legislation, information, people, tools, law, legislation, work, ai, disclose, give, love, business, trust, debbie, impact, breach, understand

SPEAKERS

Debbie Reynolds, Sharon Bauer


Debbie Reynolds  00:00

Personal views and opinions expressed by our podcast guests are their own and are not legal advice or official statements by their organizations. Hello, my name is Debbie Reynolds; they call me "The Data Diva". This is "The Data Diva" Talks Privacy podcast, where we discuss Data Privacy issues with industry leaders around the world with information that businesses need to know now. I have a very special guest on the show, all the way from Canada. Sharon Bauer is the founder of Bamboo Data Consulting. Welcome.


Sharon Bauer  00:38

Hi, Debbie; thank you so much for having me.


Debbie Reynolds  00:41

I'm happy you're on the show. But it makes me so mad because I know you really well; you and I have collaborated, and I've long been excited to have you on the show. So I'm happy you're here now.


Sharon Bauer  00:51

Thanks for the honor.


Debbie Reynolds  00:53

Well, to me, you have a fascinating trajectory in your career and how you got into privacy, and I would love for you to be able to tell people your background and how you found your way into this career path.


Sharon Bauer  01:08

Yeah, so I started my career off as a litigator. From a pretty young age, I always wanted to have an impact on people's lives, and I felt that going into law was the way to go about doing that. I quickly realized that I wasn't as passionate about being a lawyer as I thought. I think that's not an uncommon feeling amongst most lawyers. But to be honest, I had no idea what else I would do. About 10 years into practicing, I stumbled across the issue of privacy, and I was immediately fascinated. I have a background in philosophy; that's what I went to university for. I think I was fascinated by privacy because of the philosophical issues relating to privacy and the impact it has on individuals, but I was also fascinated by the impact it has on the business, and the reason I was fascinated by the business side is because, as a lawyer, I think what I loved about the law the most was the business of law more than the law itself. As you may know, law firms are not all that innovative and are quite archaic in the way that they practice. That always bothered me, and I was always looking outwards to see what other companies were doing. So, bringing in the business side, the philosophical side, and just the human element of privacy made it this incredible recipe for me to be extremely fascinated and passionate.

So once I figured out, okay, privacy might be the way for me to go forward, I gave it a shot. Unfortunately, not many law firms were willing to give me a chance because I had no experience in privacy. I was a partner at the time, so they felt like if they brought me in as a junior, I was a flight risk. So I had this big challenge in front of me: how was I going to prove myself? So I took it into my own hands. I studied at night, I was following everyone I could possibly follow on LinkedIn, and I was trying to understand what the sexiest, most relevant topics in privacy were at the time. I started doing a lot of writing in privacy to try to build a portfolio for myself so that when the opportunity came up and someone gave me the chance, they would see that I really am very committed to a career in privacy. So, lucky for me, I had some incredible mentors and people who did give me an opportunity, and the hard work paid off. I joined a Big Four accounting firm, where I got my first start in privacy. It was a huge learning curve for me, coming out of a law firm into a consulting firm, and I learned a ton. About two years after that, I thought I might want to try this on my own in a different way, in a more creative way. Having come from a boutique law firm and gone into a big corporation, I saw the difference between how the two can make an impact on clients, and I thought, I just want to try it my own way. So Bamboo came to be in January 2020; I think it was February 2020, literally a month before COVID. Then COVID came around, and that was a huge challenge, but we overcame it, and in fact, maybe it helped us, maybe it didn't. I don't know, but I'm having the most incredible time. My risks, I guess, paid off, and here we are.


Debbie Reynolds  04:58

I love that story, and something you said in your story that I did not realize, and it probably explains why you and I get along so well, is that I was also a philosophy major in college. Oh, my mother was horrified.


Sharon Bauer  05:15

Yeah, well, my parents weren't that horrified because they knew my angle was to be a lawyer. I think when I told them I no longer wanted to be a lawyer, that's when they became horrified.


Debbie Reynolds  05:23

Well, yeah, I use those skills every day in terms of being able to write and think and reason about the way we do different things. I think it's so cool that you decided to blaze your own trail in privacy. I think a lot of people will love the story because they don't know how to get in, and I tell people that getting yourself out there, letting people know who you are and what you think about privacy, is really a great stepping stone. What do you think?


Sharon Bauer  05:58

Thank you, Debbie. For me, my end goal was to try to be as authentic as I could be in the work that I do. It's really hard for me to compartmentalize work and my personal life. I wish I could, but honestly, I can't. I haven't learned that skill yet, and I don't know if I ever will. I'm a mother of two; I literally bring work home with me as well. I work from home, but I also bring myself, my family, and my personal goals into work as well. So I don't see the distinction between the two. For me, when I started Bamboo, it was all about how do I live my best life doing the work that I'm super passionate about and not have to give up one thing over the other, give up family over work or work over family. Sure, there are days when the pendulum swings to one end, and I'm working more than I should be, worried that I should be with my kids more than I am at work, and vice versa. But I love what I do, and I'm a happier person. I think that helps me be a better person, whether that is at work or at home, and I hope that I'm setting a good example for my kids and other people who are looking to be passionate about what they do.


Debbie Reynolds  07:21

I think you're setting a great example. So I can tell you, you are succeeding, my dear. Now you're in Canada, and there's a lot going on in Canada right now around privacy. Can you give us the lay of the land? What's happening? I get so confused about all the different things that are happening. But tell me what's happening right now.


Sharon Bauer  07:40

You get so confused with what's happening? Oh, all right. Let me try this. So, for those who aren't as familiar with the Canadian privacy landscape: unlike in the US, we do have Federal privacy legislation. It's called PIPEDA, and PIPEDA has been around for over 20 years. As you can imagine, it's highly outdated, especially with all the new technology that's out there. So, to make sure that we're staying on track, we've had case law that tells us how to do privacy well. We have frameworks, we have guidance, and we've had a few amendments to PIPEDA. For example, in 2018, I believe it was 2018, we had mandatory breach notification implemented into PIPEDA. The other thing that everyone should know is that we have an adequacy standing with the EU, which means that EU data can freely pass into Canada. That is dissimilar to the US, which does not have an adequacy standing. You should also know that British Columbia, Alberta, and Quebec have their own privacy legislation. So, those are three provinces whose laws are substantially similar to PIPEDA. Then, to make things a little more complicated, we also have some sector-specific privacy legislation. So, for example, in Ontario, which is another province, we have PHIPA, which is all about health and privacy. So it is a little bit piecemeal, but I would say maybe not as much as in the US. Now, for several years, there has been a concern that PIPEDA, the Federal privacy legislation, was outdated, and because it's outdated, we were very worried that we would no longer have an adequacy standing with the EU, which would potentially impact the economy because EU data and UK data would not be able to freely pass into Canada. So, there has been a lot of talk about reforming PIPEDA, and in November 2020, there was a bill called Bill C-11, which was supposed to reform PIPEDA. Unfortunately, that bill died with the dissolution of our former government, but it was reintroduced as Bill C-27, which is still alive and well. Now, a few weeks ago, we did get an adequacy standing yet again from the EU, I think to some of our surprise. So that pressure to reform PIPEDA may now be dwindling a little bit because we got what we wanted. However, with the adequacy standing, we did get some feedback hinting that we really need to enshrine some of those protective measures that we put through case law or guidance or frameworks into our legislation. So, not all is lost; we're still all hoping that there's going to be some reform to PIPEDA. It may not be as soon as we were all hoping, but I'm hoping that in the next few years, we'll see some significant changes. In addition to the amendments to PIPEDA through Bill C-27, there is also an AI component to it. It's called the Artificial Intelligence and Data Act. It's part three of the new legislation that, of course, has not yet passed; it's not law, it is still very much part of that bill, though it is getting closer to being passed. It's fascinating that they've put this on top of Bill C-27, as opposed to making it its own bill. There's some debate about how quickly we can pass Bill C-27 if we have this AI Act attached to it, because there's a lot to unpack and a lot of contested components in that AI bill. So again, we're not sure when it will pass; there's been discussion that it's not going to be before 2025. But companies in Canada are still very much aware of it and are trying to figure out what they can do in preparation for this kind of bill.
Of course, there has been some guidance from our regulators around AI, including generative AI and how to implement it properly in businesses. But right now, we're in a bit of a state of flux in terms of whether or not we're going to get new privacy legislation. Of course, the best foot forward is to be proactive and plan for what is to come, whether or not it comes.


Debbie Reynolds  12:54

I love it. I love my Canadian neighbors, obviously, and I wish that, by osmosis, we could have some of this Federal privacy legislation ethos in the US.


Sharon Bauer  13:09

I wish you would. It would make my life easier, too, when I have to advise my clients on the US. Right now, it's just like, holy moly, it's confusing, man.


Debbie Reynolds  13:20

It's incredibly confusing. Then I think the thing that gets me really upset is when jurisdictions, instead of passing a new privacy law, add a privacy part to an existing law, maybe a cyber law or a breach law. So, the number may not change and may not look different on a piece of paper, but the nuances are there, and that just makes it more complicated.


Sharon Bauer  13:46

It is, but at the same time, you don't have to learn a brand new law; you're just learning a new part of the law, which is also kind of nice. I'll speak for myself, although I know I've had these discussions with other privacy professionals: it's like, constantly, a new piece of legislation is coming out, and now you need to learn that one. At the end of the day, they're all kind of the same. They all have the same principles, but it's the nuances between all of them, and how to advise your client and not forget about that little piece of the legislation. There's something nice about having an existing legislation that you know really well, where now you just need to learn another little part, as opposed to a brand new legislation.


Debbie Reynolds  14:32

Yeah, well, I think it's hard when people do those trackers because the numbers on the laws don't change, but there have been changes. So I have a fight with a lot of those maps, like, this is not accurate.


Sharon Bauer  14:44

The other thing is, if we really hated reading legislation and learning about new legislation over and over again, we would not be in the industry that we're in, because literally every day there is another thing that you need to know. It's challenging, man. But it keeps you on your toes, and it's exciting, right?


Debbie Reynolds  15:06

Yeah, it is. Well, I love to learn.


Sharon Bauer  15:09

That's exactly why I left the industry I was in previously, where I felt it was so archaic. It was the same thing over and over again, and I felt so jealous of people who got to learn new things and were kept on their toes. I guess I got what I asked for, even though it's challenging.


Debbie Reynolds  15:26

Yeah, I love it. Well, I don't mind reading. It doesn't bother me. But a lot of times, I have to say, hey, I actually read this law. Here's the change people aren't thinking about. What's happening in the world right now, as it relates to privacy, that concerns you most?


Sharon Bauer  15:44

What concerns me most? For me, I guess it's a little bit of everything: the legislation, the technology, the humans; it all concerns me. I think my biggest concern is that everyone is looking at privacy from their own skewed lens or angle, and we're not actually addressing the real problem. What I mean by that is, as privacy professionals, I think we're biased; we think that people really care about their privacy, and if we put our privacy lens aside, what we see is that consumers may not be as concerned about their privacy as we are. This has fascinated me for a very long time because we are preaching something that consumers don't actually care about, and then we're not actually addressing the big problem here. In my study of this, what I've realized is not that people don't care about privacy; it's that they have resigned because they feel like they do not have a choice. I think that depending on where you live, your socioeconomic status, and your education and literacy, your care or your resignation about privacy really varies, and we're trying to wrap it all up in one package when it's not that easy. So, for example, a lot of people, not just privacy professionals, have now had a lot of education around privacy through discussions and news and all that stuff. Before they click the I agree button on anything, they may have this moment of cognitive dissonance where they feel really uncomfortable about what they're giving up. But they're very quick to put horse blinders on and click the I agree button because they want the convenience of getting that tool or that service that they really want. So they're willing to give up their information for convenience, and it's not because they don't care, because they did care. They had that moment where they cared: I don't feel comfortable giving up my information, but I'm going to do it. It's because they've resigned, and they've resigned because if they don't click the I agree button, then they are not going to remain relevant. They're not going to be able to participate in society with everyone else who's doing the exact same thing. I see that with my kids, even, where they feel like, oh, my God, they're going to be left out if they don't do this thing that their friends are doing. Not that I like it, but that's another story. So I do think that, and I know this is a taboo thing to say, but I've said it before, and I said it on Privacy Day last year: I do believe privacy is dead. I do believe that personal information, little data crumbs, are out there. We give little bits at a time, feeling like, okay, we're just giving a little bit. But at the end of the day, there are technologies that can collect those little crumbs and put them back together and create another Sharon Bauer that is probably more accurate than the Sharon Bauer that I know. That is my concern. So if privacy is dead, if all the toothpaste has been squeezed out and you can't put it back in the tube, right, it's all out there and you can't get it back, then what do we have left? What I think we have left is trust, and it is up to companies to determine whether they are going to try to earn the trust of consumers. So, as a consumer, fine, I've resigned. I'm going to give up my personal information to get that thing that I really want, but do I trust this company? Am I okay giving up my personal information to this company?
There are certain companies that I think most of us do not feel comfortable with, and then there are some where you're like, yeah, I want to give you even more of my information because I really trust you; you have demonstrated trust. So, for those companies that are thinking through the trust lens, I say trust drives three things. Trust drives action, because consumers will flock to you because they trust you, or at least they will leave those companies that they don't trust. Trust drives loyalty: yes, every company is going to be breached in one way or another, whether it's their fault or not, but if you've demonstrated trust through the years, your consumers are not necessarily going to leave you; they are going to remain loyal, they'll defend you, and they'll even advocate for your brand because you've demonstrated a trusting relationship with them. Not all is lost because of a breach. Then finally, trust drives data. If you trust a company, not only are you going to stay with that company, but you might be willing to give up even more of your information to them, and we know data equals profit. A company that demonstrates trust will have greater profit. That is the thing that concerns me, Debbie: privacy is dead, but trust is not, while of course we still need to protect personal information. So, to advocate for privacy, I think we really need to focus on how we build trust with consumers, because companies are not going to stop collecting information. Especially with AI now, there's even more access to information and more things to do with that information that can easily lose the trust of consumers.


Debbie Reynolds  22:00

Wow, I love that. So I've heard other people say this, and I love the way that you put it around trust because I think trust is golden, right? That's what companies really want. They really want that high-quality data because if people trust you, they'll give you good data that is accurate, and they'll give you more of it if they feel that it benefits them. I guess one of the concerns I have, and maybe it connects and aligns with trust, is that I see more and more companies, unfortunately, doing things with people's data that do not benefit the person. So that's a clear way to erode that trust. What do you think?


Sharon Bauer  22:43

Well, of course, I absolutely agree with that, having worked with a lot of companies. Luckily, I work with great companies that really want to do the right thing. There are times when they don't know that they're not allowed to do certain things with data. So, I'll say this: there are many, many companies, and I'm going to say more companies than not, that don't mean to do harm, don't mean to misuse information; they just don't know that they cannot do what they're doing with the information. Also, I think that a lot of consumers have no idea what companies are doing with their information. That goes back to a lack of education and a lack of transparency; if a company doesn't know that it's not supposed to be doing something with the information, it may not even know that it needs to disclose that it's doing that thing with the information. So I'm finding that companies have a very immature understanding of privacy, which is not necessarily the case with security. Their focus is so much on security, as it should be; one of the ways to protect personal information is through security. It's fascinating to me how companies say, we have security, so we are doing the right thing with information, we're protecting it, with very little understanding of what they can and cannot do with personal information. Or they say, we don't have personal information. Oh my gosh, that's one of the biggest misconceptions out there. Oh, it drives me crazy. But I do want to give a lot of companies the benefit of the doubt. I work with medium-sized businesses; I'm not working with the giant enterprises here. The medium-sized businesses, I think, are just not mature enough or educated enough to know what they can and cannot do. Most of them do not mean to do harm. Once you inform them and educate them and build a culture around privacy, they really want to do the right thing.


Debbie Reynolds  24:59

I think that's true. You hit the nail on the head about them not knowing what they can and can't do with data. I think the idea has been, from a security standpoint, as long as we protect everything in the castle, it doesn't matter what happens with the data inside. We know from a privacy perspective it actually does matter, depending on what the data is and what you're using it for. Being able to have companies understand that message and understand how they need to change the way that they operate, I think, is very important. So I feel like sometimes the legal advice is untethered from the operational actuality of what's happening within organizations. What do you think?


Sharon Bauer  25:46

Absolutely. So, I do feel that sometimes, when you interpret the law, you tend to interpret it as black or white. But operations are all about the gray; I wrote an article called 50 Shades of Privacy because it really is all about the context. With a lot of the companies that I work with, we really get ingrained in their business and operations, and we work with the different functions. The biggest problem that I see is that they're collecting personal information for one purpose, and then another function, usually marketing, will take that information and use it for a different purpose. They justify it in two ways. Number one, all of our competitors are doing this. Number two, no one has complained, so our customers don't care that we do it this way. That, again, goes back to the thing I talked about before with resignation: what are customers going to do if this is the only option you're providing them with for the collection, use, and disclosure of their information? This is, again, where privacy legislation is trying to bring in privacy rights, choice, and transparency. But even with that, and I see it a lot with ad tech and marketing, you do give consumers a choice. But guess what? They have no idea how to even use that choice because they don't understand what you're doing with it behind closed doors. That, of course, is the big problem; how do you explain it to them in a really easy, clear, friendly way, as the regulators want us to do, when the people implementing it in the business can barely even understand what they're doing themselves? So this is such a problem. How do we break this down? Even when we break it down and give just-in-time consent and notifications, it creates a disruption in the seamless journey a customer wants to go through to use a product; they don't want to see those pop-ups every two seconds. That's really annoying. So then, how do we get past this hurdle? I don't necessarily have the answer, Debbie; I wish I did. I think if someone had the answer, all of our problems would be solved. But this is the problem.


Debbie Reynolds  28:23

Yeah, I think that's true. Actually, the example of what's happening with 23andMe right now, I think, is very instructive, because I have family members who signed up and did this thing, obviously to my horror, because I was like, oh God, don't do that. But it's very innocent: hey, I want to know, especially if you're Black, what's my lineage? What am I made of, and stuff like that? So that's a very innocent thing, right? Or some people say, well, I want to know if I'm predisposed to certain ailments based on my lineage or heritage and stuff like that. They didn't intend for their data to be in police databases. They didn't intend for this data to be used against them if they get, let's say, denied for insurance because they're predisposed to some ailments that they don't actually have, and things like that. So the breach of that data, I think, is giving people a new eye for the bad things that can happen when your data is misused. What do you think?


Sharon Bauer  29:25

I absolutely agree with that. In this situation, we're dealing with very sensitive information. We're not dealing with someone's email address, although in certain contexts, of course, an email address can be sensitive. But this is where the company needs to demonstrate responsibility for someone's very sensitive and intimate information. We see more and more companies making mistakes around the misuse of data, the disclosure of data that shouldn't be disclosed, or the lack of security. Unfortunately, it makes the news, and unfortunately, people are impacted. They have to learn the hard way to actually start to care a little bit more about their privacy. I've said in other talks I've done that we're on this cusp of people going from resignation to kind of caring, because they may have been impacted personally by a breach, or they can relate to other people who have been impacted and say, oh, my God, thank God it wasn't me, and start to internalize: what am I actually giving up? That's where that cognitive dissonance comes in, that moment of, oh, do I click it? Do I give up my information, or do I not? Unfortunately, we're going to see so many more of these breaches; maybe that is the only way to actually educate people on the implications and the risk of giving up their information. Unfortunately, though, for most of us, again, it's just giving up little crumbs of our information. One day, possibly, those crumbs come together to create an individual just like you and to disclose information about you that you never intended for anyone or any company to know. We see that a little bit in ad tech, to be honest, where there are companies who may be looking at social media or websites that you visit and start to make some predictions about you, to create a profile of who you are and advertise to you. Now, that might not be that sensitive, necessarily. Sometimes it is, but it may not be as sensitive as the 23andMe situation, where that is really sensitive stuff. But how do you know that one day in the future we're not going to have some government that will say, I want to collect all this information about an individual, and what kind of impact will that have on you? So, for example, with Roe versus Wade, where a woman is trying to get an abortion pill and she can't, she may be driving to a certain location to pick up a pill or to a clinic and not realize that she's disclosing location information about herself. Somehow she's found out because of the location that she attended, which was a clinic; this actually happened. Data brokers were able to collect and sell information about women who were visiting abortion clinics. These are little crumbs that we're leaving behind, not realizing the implications down the road. That is what concerns me the most.


Debbie Reynolds  32:53

I agree, I agree with that concern. One thing that Ann Cavoukian, the former Privacy Commissioner, who's also Canadian, said when she was on the show: her view is that privacy isn't a religion, and so if people want to give away their data, they can. But for me, I think people need to know what they're giving away. They need to know what those implications are so that they can make an informed decision. What do you think?


Sharon Bauer  33:21

Absolutely. I mean, how can I not agree with that? It's just a matter of how you inform them. How do you inform them enough so that they really understand the implications? And to be honest, what are the implications in the future? You don't know; the company may not even know what the implications are in the future should the laws change and they must disclose this kind of information. No one really knows, and it is a very personal decision. This is exactly the reason why I love privacy so much and the philosophy behind it, because there are so many angles from which to look at this. It's like, well, it's my right to give up my information and do what I want with it. But then, I don't know what the implications are 10 years from now; this data is not going anywhere, right? Yes, we have all these retention policies, but no, my data is everywhere at all times. No one's getting rid of it, as much as we say that you must do what the law says. It's still out there.


Debbie Reynolds  34:26

Yeah. I'm going to segue a bit just because you've mentioned it, and we're going to go there. One thing that I found that some men don't seem to realize in the US is that because of the Dobbs decision, women in the US have lesser privacy rights than men do.


Sharon Bauer  34:46

Mm hmm.


Debbie Reynolds  34:48

Have you thought about that at all?


Sharon Bauer  34:51

That women have less privacy rights than men do?


Debbie Reynolds  34:54

Yeah.


Sharon Bauer  34:55

I don't think they have less privacy rights. I mean, the law doesn't say that men have more rights than women. I think that due to circumstances, for example, the new anti-abortion laws, women are implicated more by the disclosure of their personal information. I think that can also be said for marginalized individuals or individuals of a lower economic status who may not necessarily understand the implications of giving up their personal information. So I think it is a systemic problem. I don't think that one has less rights than the other; we all have the same rights. It's just a matter of education and circumstances, which is unfortunate.


Debbie Reynolds  35:54

Yeah, very interesting.


Sharon Bauer  35:57

Take the whole Meta situation, where they've changed their consent structure so that you don't have to be targeted by ads if you're paying for the service. But if you're not paying for the service, maybe because you can't afford to pay for it, then you are going to be targeted by ads, and your personal information will be disclosed to marketers. So it's kind of the same situation there, too. That is a problem we're maybe going to face more and more, even with AI, potentially, and with AI assistants, where some people can afford them and other people can't, and there are different levels of privacy protection in there. We're actually going to see more and more discrimination when it comes to privacy; it's going to be really fascinating.


Debbie Reynolds  36:54

Well, so when I was saying women have less rights, based on what you said, and I say this a lot, I think we're creating almost like a new caste system, where people who have access to data, and who have the ability to choose whether or not to be in data systems or have that type of tracking, will have more agency than people who do not. What do you think?


Sharon Bauer  37:25

I do agree with it. Again, I 100% agree with women being in a different caste, but I think I just look at it more broadly and not just at women. I'm thinking these days about those individuals who attend protests, whatever side you may be on, whatever issue you may be fighting for, and about tracking your location through your device and then understanding some of your beliefs, political or otherwise. How is that going to impact you when you may not have necessarily meant to disclose where you were? I mean, you're at a protest with, what, hundreds of thousands, maybe millions of people, but that is your decision in terms of how you want to portray yourself or have people know certain things about you. But that may come out in the open, and you didn't even realize that you may be penalized or implicated because you attended a particular protest. That's another issue.


Debbie Reynolds  38:26

Lots of issues; we have so much work to do that way.


Sharon Bauer  38:30

Also, we didn't even talk about facial recognition. We talked about location tracking, but what about facial recognition when you're at a protest, and how is that going to impact you? So there's so much to uncover.


Debbie Reynolds  38:43

Yeah, yeah. Well, if it were the world according to you, Sharon, and we did everything you said, what would be your wish for privacy or data protection anywhere in the world, whether it be law, technology, or human behavior?


Sharon Bauer  39:01

Maybe I'll put it all in one. I know I'm going in a very different direction, and I'm going to be a little bit more selfish in terms of my business and how we operate, but the one thing in privacy that I'm, how do I put it, not impressed with, and I'm surprised we don't have something better, is the privacy tools for companies to use. So again, I'm more about the operations of privacy. I have been exposed to a lot of automated privacy assessment tools, where it's plug-and-play: a company can answer a bunch of questions based on controls that come out of the legislation, and based on their answers, which are usually a yes or no, they get a rating, or they start to understand what their privacy maturity is. I've played around with these tools, and it upsets me, Debbie. It upsets me because it is not black and white; we talked about that just earlier. It's not, yes, I comply with this particular control, or I don't. A company is usually somewhere in between, and how do you manage that? Most companies don't even understand what a control is, let alone a privacy control, and it is complicated. These tools, unfortunately, do not consider the environment in which the company works, the moral or ethical considerations for the particular tool or service that the company is offering, or the social context that the company is working within. So it frustrates me a little bit that there are companies who feel that they can engage with these tools but do not get the right instructions on how to implement privacy in their operations. They think that they're saving money by using these tools, which may then give them recommendations on other tools to use and bring in, when maybe half of those tools and recommendations are not even relevant. They end up spending more money to remediate than they actually need to. As well, these tools take a very high-level approach to the business, but every function operates on its own and is almost like its own business. So, how do you put the risks for every function under one high-level umbrella? I have not yet seen a tool that can do this really well, Debbie; if there were one, I would love to use it. I do this manually with my clients; I speak to different stakeholders and understand their business and their different functions. It would make my life a lot easier if I could find one of those tools and work with my clients on that tool itself. But it doesn't exist, and it's shocking. I do believe that one day we're going to have such a tool, and I do believe it will include AI to consider some of those contextual elements in the assessment. It will base the assessment on the company's industry and consumer expectations, not just the law. So that's my wish.


Debbie Reynolds  42:40

Oh, that's a great wish; I love it. I'm glad you mentioned that; I think I have the same annoyance when someone says, oh, we're going to implement this, blah, blah, blah, and this other framework. I'm like, well, that doesn't actually get down into the weeds, truly, of how people are actually handling data. You're right; it's like a scale; it's not a yes or no, it's not black or white. It's more that I feel like we're trying to answer a qualitative question using a quantitative method.


Sharon Bauer  43:14

Exactly. At least in Canada, PIPEDA is more of a principles-based legislation. There's nothing too prescriptive about it, and so most companies take a risk-based approach with it. A lot of these tools are not really built to take a risk-based approach; it's either you do it, or you don't. That doesn't work in most companies, and certainly not in every function. For every function, you do need to look at them slightly separately to understand how they operate; marketing is going to be completely different than, say, your HR or your sales or your ops. So it's a little bit frustrating. Then there are other tools that do operations, and some of them actually do it well, I have to admit. My frustration there is that there isn't an amazing tool that does all the operations, like the data mapping and the DSARs and the vendor due diligence and the PIAs, all in one, that is also easy to use. So there's a lot of frustration in terms of the operations of privacy. I think that also is a disservice to privacy and the privacy function, because if you're not going to make it easy and you're not going to humanize it, then it becomes another really daunting task in the business that no one wants to do.


Debbie Reynolds  44:49

Wow, that's a lot to think about. So let me ask you this. Do you think that this is a problem that can be solved with technology?


Sharon Bauer  44:59

I do. I don't want to say it's not out there. I just haven't seen it yet. One that encompasses all of my wish lists, or my wishes on my wish list, I should say.


Debbie Reynolds  45:15

Well, we have a lot of people who are in the business of making tools, and who fund those tools, who listen to this show, so I hope that they will definitely listen to what you're saying; what you're saying is absolutely true.


Sharon Bauer  45:29

They're welcome to reach out to me, and I'm very happy to give them my perspective, because what I do day in and day out is operationalize privacy and try to make it simple for my clients to use so that they actually use it. If you're going to complicate things, forget it; no one's going to want to do this. We're here to try to protect individuals' privacy, so let's try to make this easy to do. The other thing is humanizing the process. This is where I don't think it's necessarily about technology; it's about making privacy accessible within a company. It all starts with speaking to the people who need to implement privacy, the people on the ground who need to comply with the internal policies and procedures, and making them really understand why they need to protect personal information and privacy. The only way to do that is to make them think about their own privacy before they think about anyone else's privacy, right? Put them in their own shoes. What if this was your information? How would you react if you knew that your own information was being used in this way? Only then can they think about whether we are doing the right thing. So, it's about humanizing the process and embedding privacy values in the DNA of the company; unless it's in there, I think you're going to have a really hard time operationalizing privacy in your business.


Debbie Reynolds  47:10

That is fantastic. Thank you so much. I'm so excited that you're on the show, and I know that everyone else will love it as much as I do. You dropped some knowledge on us today, so thank you for that.


Sharon Bauer  47:22

Thank you so much, Debbie; it was a pleasure.


Debbie Reynolds  47:26

All right, talk to you soon.