Security Sandbox 5.26.22
Ep 15 | 5.26.22

Data Privacy Is the New Data Security: Putting People at the Center of Everything


Amanda Fennell: Thanks for tuning in. If you enjoy today's episode, please rate and review us wherever you get your podcasts.

Amanda Fennell: Welcome to "Security Sandbox." I'm Amanda Fennell, chief security officer and chief information officer at Relativity, where we help the legal and compliance world solve complex data problems securely, and that takes a lot of creativity. One of the best things about a sandbox is you can explore and try anything. When good tech meets well-trained, empowered employees, your business is more secure. This season, we're exploring ways to elevate the strongest link in your security chain - people - through creative use of technology, process and training. Grab your shovel, and let's dig in. 

Amanda Fennell: All right. I was actually going to start singing. We thought this was the karaoke session, so thank you all so much for coming today. I'm Amanda Fennell, chief security officer, chief information officer. We got a slide, so we all remember who we are. And I will say that I know I'm definitely going to accidentally say data privacy at some point today, so bear with me. I will try to correct myself to privacy. And I have two people next to me who are going to make sure I do that every time. I also wanted to introduce - I've got a couple people joining me today, and we'll start with, always on my left, Relativity's senior director, global security and IT, Marcin Swiety. Say hello. 

Marcin Swiety: Thanks, Amanda. 

Amanda Fennell: Yeah. 

Marcin Swiety: Hello to you all. 

Amanda Fennell: And Zachary Faruque, director of Deloitte U.K. 

Zachary Faruque: Good morning, everyone. 

Amanda Fennell: And what we plan to do over the next 40 minutes is to dive into this topic that should really draw equal parts groans and cheers from everybody - how do you create a globally relevant data privacy program when you're trying to meet these increased complexities over time? So we know there's a lot of regulations, frameworks and so on that we have to deal with already, but there's a lot of things that keep changing, and everyone's trying to figure out what's happening in the future and how to build towards that. So we know we have to have a lot of flexibility. We have to be able to update. We have to make sure we're doing the best that we can. But we have some questions that we're going to work on in our panel today to try to figure out what we're doing at these different locations and then definitely have an open session where we'll ask what you're all doing. All right. This is going to be an interesting little format because we actually typically do this in a podcast. So we have "Security Sandbox" - I'll do this for one sec so you all know what I'm talking about. I swear we have a podcast. But "Security Sandbox" - we've been covering a lot of different topics. Last season was really about bringing some passion into security from very different areas, but this season's really been focused on how people get combined with technology, which we're all familiar with, and how to be really effective at our programs. So that's why today's topic's a pretty good one. So I'll keep this on here for a bit in case anyone forgets who we are, and we're going to start with our very first question. So I don't know, I feel like I might kick this to you. So it's an area - what's one area of data privacy that actually excites you the most? 

Marcin Swiety: Data privacy is basically a very universal right, right? It's not a privilege. It's not something that is just, you know, out there in bits and pieces. This is something that every one of us is - has a right to have. So what excites me is how that all comes together in different places of the world to actually achieve the same thing. That's kind of a very complex thing, right? We have different culture. We have different ways how we operate. We have different advancement, different technologies, different industries, and it all comes together through the same very, very basic right to have data privacy of what we produce, how we work with that, and how we actually use that to advance innovation and business and achieving those benefits all around different industries. So that's the thing that excites me, the - kind of how it all comes together to basically meet in the middle. 

Amanda Fennell: OK, so I'm going to give Marcin a hard time, like I always do, after years of us working together. So that's exciting, though? I mean, it's an inherent requirement. We have to have this. You just are excited by how it all comes together. 

Marcin Swiety: Yeah, it's - you know, there is a little bit of that - it requires us to cooperate with a lot of different spaces, and that brings a lot of different people, a lot of different perspectives. And me - you know, the title says global, right? 

Amanda Fennell: (Laughter). 

Marcin Swiety: So that's what I do. That's what excites me. That's basically what brings energy to my daylight. 

Amanda Fennell: That makes sense. It's valid. OK. I do think that, Zachary, given your working background - how do you think we're going to improve this user experience? We feel like it's a fundamental thing - we should all have this - but the user experience, when it comes to data privacy - very different today than it was 20 years ago. 

Zachary Faruque: Yeah, absolutely. And I think, more specifically, when we're talking about data privacy and the challenges that are coming at the moment, it's important to know that, most obviously, we're talking about the digital ecosystem - so all of our digital channels, whether that's a website, whether that's an app. It could be even things like connected TV - the metaverse, as it comes and is upon us today. And I think it's important to also take into context why we're changing this. Why should we be focusing on this now? And a key piece is that, ultimately, all of our expectations and our awareness about data privacy, as consumers, has changed. And that's changed because regulation is trying to catch up with technology. Technology has outpaced regulation for a long period of time. Since the GDPR, we're now seeing an avalanche across the globe that we're trying to keep up with. But with those regulations coming in, the big tech companies are pushing it even farther. So the hot topic at the moment is, of course, the deprecation of third-party cookies by Chrome. That's going to have a massive impact onto the adtech ecosystem, irrespective of which side of it you sit on. But I'd be curious to know, actually, in the room - just by a show of hands, if that's OK... 

Amanda Fennell: (Laughter). 

Zachary Faruque: How many people have seen "The Social Dilemma" on Netflix? Not enough - highly, highly recommend, because it's really important to know that the youth that are coming up today are coming up in a world where this is shown to them. It's not just about headlines in the news. You don't have to go and find this information. It's right there, front and center. Hollywood is showing this to them. So I think that's really important. And I suppose - to get back to the actual question of, how do we do this and focus on the user experience? - the best example I can give is one that I experienced last year. And it was during COVID. If not for COVID, I wouldn't have experienced this. My girlfriend at the time moved in with me, so spoiler alert there about where this story's going. My girlfriend at the time moved in with me just before COVID. And I thought, well, if we can last a year locked down together, this is the woman I want to marry. 

Amanda Fennell: I felt the same. And my husband is sitting right there. I love you. 


Zachary Faruque: And so that meant I needed - the first thing you need to do if you want to propose to someone is go and buy an engagement ring. And normally, in a world without COVID, I would have said, I'm off to the gym. I'm off to play golf. And I'd have hightailed it down to the nearest jewelers and started looking for that perfect ring. But that wasn't available to me. I had to go online to buy an engagement ring. But I knew the minute I did - by searching for engagement ring, by visiting jewelers' websites, she'd come back from work - she was a key worker at the time - and get spammed with adverts. And one of the biggest and best surprises, we certainly hope, of our lives would have been absolutely destroyed. I sat back and I thought, this is unbelievable. The very brands that are wanting my custom are disincentivizing me to visit their digital channels because of the practices in and around the adtech ecosystem and cookies. And the first thing we want to do when we've got a website is, how do we track everyone? And not just track them on our website, we want to know where they go so that we can then send them an ad later down the line. We wouldn't even dream of doing that in the real world - hiring a private investigator to tail every person that goes out of our shop, our bricks-and-mortar store, follow them where they go for lunch, where they go home and then, obviously, send them direct marketing that way. 

Zachary Faruque: But the experiences can be improved. What could those brands have done to improve that? Well, when I visit the website, they could have said, are you looking for a surprise gift? And I have said, yes. And that's the first piece of first-party data they would get from me within 10 seconds of me being on their website. And then they could have said, your secret's safe with us. We're not going to track you. We're not going to send you adverts that are going to ruin the surprise. But what we would like to do is send you discounts, because as a consumer, I don't want the adverts. But I want the discount that comes with the advert when I'm three websites down the line. So if you'd like to receive those discounts on some of the products you look at today, would you be willing to provide us with your email address? And within 30 seconds of me being on their website, they would have my email address. They would have a direct relationship with me. And I've been treated like a human being. Can you imagine what their marketable universe, their customer database looks like if they take that strategy? So moral of the story, I think there are very simple things we can do. And it's about treating human beings as human beings in the digital ecosystem the same way we would in a bricks-and-mortar store. 

Amanda Fennell: That is - well, so how did it end? 

Zachary Faruque: Well... 

Amanda Fennell: Yeah. OK. Married, OK. 

Zachary Faruque: If it'd been bad news, I'd have been too hurt to tell the story again. So... 

Amanda Fennell: With my ex-girlfriend. 

Zachary Faruque: (Laughter). 

Amanda Fennell: But, yeah - no, I think the interesting thing is that you mentioned the youth, which we hadn't talked about ahead of time, but I do think this is a really interesting one. So I have three kids. They know way more about so much in technology than I ever would have at that age, maybe - not about Zelda but, like, other stuff. And what's funny is they are so interested in playing, like, Fortnite and Roblox and so on. And I'm so scared, as a security person, about my children. I've seen a lot of "Law & Order" episodes. OK, I'm very scared. So I've convinced them, for securing themselves and having an amount of privacy, to create a fake persona, like James Bond. And so they have fake names, fake age, fake city that they live in. And it's their persona that they've come up with. Now, they may end up having mental problems later where they don't know who they are. 

Zachary Faruque: (Laughter). 

Amanda Fennell: But this was the only thing I could think of because I didn't want to not - I didn't want to be the parent that was like, you can't be on any of these video games because I was playing video games. This was the only thing I could think of. And they just leaned into it. They embraced it. They love it. I would love to share their names sometime, but obviously they're private. But it's just something that - we do have a different type of consumer these days that has come about. So I think it's a great point. But the second part here - I love the idea - and I wrote this down because treating humans like humans - what a novel idea to actually consider, you know, what somebody's doing here and why they're there and what we could be doing very simply for their user experience to make it much more streamlined. So it makes a ton of sense. 

Amanda Fennell: So in security - this is a pivot - OK? - because you have security and IT, like me - high-five. Yeah, every day, all day. So we collect a lot of data as a company. We collect a lot of data about lots of different things. We use a lot of different tools to do it. We have to deploy some of those things to collect this information globally. Any rough experiences about how you do this on a global footprint? 

Marcin Swiety: It's complex. That's, I think, the rough... 

Amanda Fennell: That's it. It's complex. 

Marcin Swiety: I will try to expand on that. There are a number of challenges with that. We have different jurisdictions, right? We have different cultural nuances around how we handle that and what personal data actually is. So when we are doing any major rollout - and we had a couple of those, like, major transformations - or we are, you know, approaching new regions, new spaces, there are actually three things that either can go wrong or that you can, you know, put a priority on and get right. 

Marcin Swiety: So first is you can - you should start early. Like, we cannot be responsive to this. As I mentioned, this is a very primary type of right that every one of us - every user - has: the control and understanding of how the data is being stored, how it is being processed, where it's used. So that's one - start early. Look at, you know, what you are trying to achieve and then work backwards from that. So if you want to be in this market in two years or the next year, then work backwards from what that means - what type of region it is, what type of industries you have in that region - because there is not only the part that is related to the end user; the entire industries are also regulated in a different fashion, different ways. And as I mentioned, it's complex. So you have to work backwards to account for any big changes, any big things. 

Marcin Swiety: The next thing is you have to take your A team to that. This is - as I mentioned, this is complex. So this is not something that you can do by, you know, scrambling around a couple of folks from different spaces. You have to have a strong understanding of who is the owner of this and who is going to roll out the entirety of data protection and data privacy on a global scale. 

Marcin Swiety: And the last part is you have to kind of think about the end users, actually, because it's not only following the regulation. Like, the regulations are very, very specific what you have to do to get into that space - what you have to secure, you have to produce, what you have to - you know, even tick on a checkbox. But in the end, we want to care about the users - right? - and the data that we are processing for them and, you know, achieving the business outcome of that. So sometimes you need to bring your peers from the industry, from the new region, and really ask what they are really concerned about - not only to tick the box on the regulation, but kind of listen in on what's the level of acceptance, what the level of understanding, what's the level of intent that we want to put into that effort. 

Amanda Fennell: He sounds so confident, but it's because we made a lot of mistakes, actually. 


Marcin Swiety: Yeah. 

Amanda Fennell: No, but there was one thing you didn't mention. As we've been dealing with a lot of this, the most important part that I felt helped us be successful was a local voice and representation in that planning, in that strategy. There's a reason Marcin and I work so closely together: I don't want to do everything U.S.-centric. Then I'm going to build something that's only going to work in the U.S. So every time we have to look at something - even in Germany, for example, when we started to have people who were working there - and how we would collect that data as employees, the first thing we did was start to reach out and talk to people who were local, in the office, and figure out how we would frame this in a way that would work - a way that's culturally relevant and also relevant to their experience, because how they look at data is definitely very different than other areas. So that was one - the local voice was important. 

Amanda Fennell: Probably a second one that I would say I think we try to emphasize a lot is to, like, come from a place of being humble. Everyone feels like they've got this right. That's, like, across the board. Every entity feels like we know what we're doing and what - the way we're doing it is the right way. And I think that it's right for them. It's right for their location. But I just come from a place of trying to stay humble because you can learn something from the way these people are doing it in different places, but it may not work for the way that you do it. 

Amanda Fennell: And the last thing - I'm not going to go into this, but you definitely have the trigger word there of who's the owner. That's a whole nother conversation that'll be in a later session. But it is a question - who owns this data as soon as it's created? Is it yours because it's about you? Is it someone else's because it gets stored somewhere else? Is it archived? How long is it there for? To start talking about the location of the data, the security of it, the control of it - this becomes, I think, one of the more fascinating topics. But so global deployment, a lot of data. You don't know anything about global stuff, right? 

Zachary Faruque: No, not at all. 

Amanda Fennell: Not at all. Deloitte - not at all. 

Zachary Faruque: No. 

Amanda Fennell: What's your perspective on this one? 

Zachary Faruque: I think we see both approaches in different companies. Some choose the centralized approach. Some choose the decentralized approach. And I think what's really important is just, more than anything else, awareness. And I think if you can get across the entirety of your business a real idea of morals in and around everything we do with data, whether it's storing it, whether it's processing it, whether it's capturing it, whether it's sharing it, then, actually, you're not going to go too far wrong. And I did a webinar. I can't take credit for this. I did a webinar with... 

Amanda Fennell: You did a webinar. You didn't do a podcast. 

Zachary Faruque: No. Unfortunately not, no. 

Amanda Fennell: Just making sure. OK. 

Zachary Faruque: But I did it with a gentleman called Harry Dekker, who's the media director at Unilever. And he said he shared out to his team two questions that they have to ask before they embark on absolutely anything regarding data. The first question - would you do this with your mother's data? The second question - how would you explain this to the 9:00 news? And so if you can just at least get across that moral sense of right and protection and understanding, then, actually, everything else will fall into place. 

Amanda Fennell: I'm writing that down. I'm going to try that out and see if it works. This - so I think this is where it starts to become my personal favorite. It's close to a chess game at this point. We have data. We know how it moves. We know how it flows. We can track all this information. How do we make sure we're in alignment, whether a centralized or not centralized approach in how we're in alignment with everything? But it does feel like it's a chess move. And I often will look and try to understand, who are the beacons who are giving us this indication of where we - what we should be building for? So who is three, four steps ahead? So I think this dynamic of having to shoot for something that we aren't even sure if this is the way - the right thing or not becomes a little more complex. So, Marcin, I don't know if you feel like this also feels like chess. 

Marcin Swiety: A lot. And I think the basic thinking around that is you have to have some different scenarios, right? It's not about, you know, just preparing for this set of things that you have to accomplish. There's no roadmap that is just serialized. At some point, you have to think about parallelizing and also, you know, considering different paths for how you are going to deal with that, especially around things like new legislation being rolled out, new regulations and new markets. We have a boom right now. Like, the last couple of years were actually very big on introducing new laws, new certifications, new regulations in different parts of the globe that you have to adhere to very quickly. And you have to pivot sometimes to make sure that you are aligned. So parallelizing that and creating more of a, you know, what comes if I do this? What's the next step, considering that we are meeting the time that the new law will happen? So that's kind of a tree-like thinking - hence, I think, the resemblance to the chess game you mentioned, because when you are taking a step in chess, you have to consider those different responses - right? - those different things that might change that you don't have control over. Some of these you can control with your chess move. But some of these - they are basically the reaction from either the market, from legislation offices and so on. So you have to be prepared up front for those different things that might happen. 

Amanda Fennell: But - so in chess, you have an opponent. So, well, actually, I'm going to do the same thing you did. So how many people play chess, before I go with this? Awesome. Not enough. Not enough. No. But no, because there's a question here about my - I'm - OK - a security person. The last piece I ever move on the board is the queen because I feel like as soon as you start to move your queen, people can pick up your style. They know what you're doing - but also because she can move so many different ways. So in a data privacy program, who's the queen? Don't say me. But I would say, like, your tool, your people, your process - who's the most agile thing that's allowing you to adjust to those different things and regulations as they come up? 

Marcin Swiety: People - always people. And this isn't - like, you have to have a function. Of course, the ownership I already alluded to. But I think that's the point that actually allows you to have that flexibility and responsiveness to what's happening, and also the productiveness of that. All the tech, all the processes that you can set up - they are good. They will help you get there. They will help you structure the things that you are going to need. But it's the people part of this that will give you the capacity for those different moves and, you know, to move very quickly in one place, very strategically in other places. So I think that's the thing. 

Amanda Fennell: It's such an irony considering everyone always wants to say people are your biggest weakness, but we're actually saying this is the one strength you have on your side. They can move a lot faster and be more agile and pivot. But yeah. What do you think? 

Zachary Faruque: I mean, I totally agree with Marcin there. But I suppose to take another turn and give everyone else something to think about, I think there are some really simple things that we can do fast. Ultimately, the biggest risks are the ones that we don't know. If we don't know about them, then we don't realize our exposure. And so just really starting with those records of processing activities, to use a GDPR term, but fundamentally just understanding all of the data that we have and we capture and process, and how we process it, is going to be your absolute best friend in understanding that risk. 

Amanda Fennell: That's actually an awesome segue because this sounds like a lot of things that we may have said might resonate. You might say, yeah, OK, that sounds right. I totally agree or disagree. And this sounds like stuff we've already covered. Let's get tactical. If you have to get tactical about building a program and you know that humans are at the center of it, what are some tactical takeaways that you would give in addition to that? 

Zachary Faruque: Yeah. So as well as the RoPAs and, of course, privacy impact assessments - those kinds of pieces can be really helpful when we're taking on new technologies, onboarding new vendors. I think training and awareness is really, really important. And it needs to be targeted, tactical training and awareness. It shouldn't just be a one-hour e-learning course that the entire firm or business is required to do every year. It needs to be relevant to their role and their job. If they're customer care, if they're client-facing, customer-facing, then we need to give them the knowledge and the power to be able to handle things like data subject requests. They need to know that, as a business, we have 30 days to respond to these, and the importance of being transparent with consumers. So for me, I think the key piece is training and awareness, but very specific, tactical - not just a one-hour e-learning course. 

Amanda Fennell: OK, so I only think this is interesting because I do think it's an overlap with security because we're always judging each other, our companies, vendors, supply chain and everything, and so I'm going to give away the secret of how I normally start to judge an entity, and it's the same with privacy as it is with security. And it's normally, you know, when something really bad happens, you kind of start with responding to it, when you have no real good security program. Everyone says, oh, crap, something bad happened; we got to figure this out. So they start to get people in place and put a process in place, and then they start to figure out what they're going to do, how they're going to respond better. Then they work their maturity up a little bit and start to get better at detection. How do we detect it before it happens? Then they start on prevention. How do we make sure nothing will ever happen? The similarities here for data privacy in a program, I think, that I look for are the same, which are the questions of, OK, first of all, is this a function that you have? Do you have this function at your business, in your corporation and so on? Because if you don't, that's already a bad sign. Is it just Marcin down here really enjoys data security and data privacy but doesn't actually do anything with it? He just enjoys doing this on the side in between his other Dungeons & Dragons games? That might not be a really strong program. 

Amanda Fennell: The second one - is there a person who's dedicated in terms of some kind of a role - chief risk officers, data privacy officers and so on? There has to be a person that's typically in charge of that function as well. And then you mentioned this earlier, but what is the long-term strategy? So I start to - like, to kick up the dust a little bit. It's great if you're, you know, checking those boxes. That's OK. That's something. But do you have something that you're trying to build for in the future? Are you just reacting? Or are you vision-casting something? Because I think that's where the onus really comes upon us. We have to be vision-casting something. Otherwise, we're just responding all the time, and we'll never win. You definitely don't win in chess like that. So that's one of the things that we focus on a lot. All right. Get tactical, Marcin. 

Marcin Swiety: Listen more. I think that's the - OK, that's the obvious one. But you have to find your peers. They're going to be your partners in crime. So sometimes this might be your sales team to understand, you know, what struggles our customers have. And it might not necessarily be the problem on the technology level. We might provide the right technology to solve the problems of securing the data, making sure that it's available, and it's doing the right thing for the entire purpose. The problem might be somewhere else, how we communicate that, how we make sure that the comfort level and understanding of our tools, technology, processes and the way we secure the data is intact and, with that, how we align with local regulations. So sometimes the discussion is something that is important. It might be your vendors. So, you know, involving your vendors in your concerns or your discussions, how you want to roll something out. That's also a very important, key thing - considering your own data program. 

Marcin Swiety: And also, a peer that we sometimes, you know, lose sight of is our users. We have to listen to what our users want to do with the data. It's not just, you know, company to company. It's not just, you know, a business-to-business relationship. At the end, we are talking about personal data. And personal data is not something that is solely owned by a company. It's something of a real person at the end. So listening to that, shortening the distance there, creating very clear privacy policies, very clear intent of use - that's also how you will solicit information from your peers, because the end user is also a peer in that scheme. So I would listen some more. 

Amanda Fennell: So this sounds awesome. And I feel like I could go the high road right now, and we could wrap up, and this would be great, but I'm going to make everyone super uncomfortable up here. We're going to ask them questions they don't want me to ask. So here we go. So what upcoming or current regulation not yet implemented scares you the most from a business compliance perspective? Which ones will be the most difficult? 

Zachary Faruque: That's a great question. I think there are going to be different ones that are difficult for different reasons. A good one to think about for global businesses is the PDPL, which just dropped in Saudi Arabia. Key piece of that is that it's not just about fines. There is imprisonment if you get it wrong. No one wants to go to prison in Saudi Arabia. No one wants to go to prison anywhere. But - so I think... 

Amanda Fennell: But it wasn't my choices... 

Zachary Faruque: Yeah. 

Amanda Fennell: OK. 

Zachary Faruque: Yeah, that's definitely at the bottom of the list for me. So I think that's going to be a key one. And I think that's going to raise awareness just generally within the Middle Eastern region. From a European perspective, I think the ePrivacy Regulation, when it comes out, because that will probably look to address some of the technologies that are looking to replace third-party cookies at the moment - things like server-side tagging, where I'm no longer dropping information on the device, so it's fine for me to track you wherever I want to track you. So that's probably going to be a key one in the European area. And then for the U.S., obviously, the federal law that will at some point hit and hopefully make things easier in some respects, rather than it being state by state at the moment. 

Amanda Fennell: You think that will actually happen? 

Zachary Faruque: Yes. Yeah. Within the next two or three years. 

Amanda Fennell: I totally don't at all. Like, I think that's going to be such a battle. I think every state wants to have very different perspectives. But we're going to see. And we can come back next year. 

Zachary Faruque: Yeah. 

Amanda Fennell: We'll do this one again. But all right. So, Marcin, you're going to really not be - I know. You know this is coming. OK. Well, you don't know what it is, but - so data privacy is trending in different directions between the U.K. and the EU. Which side would you rather be on, and what are the risks or benefits of this?


Marcin Swiety: That's a hard one. That's a spicy one... 

Amanda Fennell: This is the part where it's so - it literally says spicy meatball, like, right there. Yeah, I know. 

Marcin Swiety: I think I would side with the U.K. I have a kind of love for how the U.K. has operated over the past couple of years with the ICO - the way it's been more proactive with some of its materials, some of the ways things have rolled out in the U.K. over the last couple of years. The EU is a little bit different. There are a lot of different jurisdictions that had to fold into GDPR, jump on the train. The U.K. was, I think, more proactive about some of these things. And I love - in January, the new information commissioner at the ICO said something that really resonates with me. On one end, we need to, like you mentioned, treat human beings as human beings and make sure that we are really fulfilling the rights of every person. But at the same time, we are collecting that data and using that data to achieve certain - not only business initiatives and business outcomes, but we also need to advance in innovation. We have to be more creative.

Marcin Swiety: We have to - that data-driven innovation actually pushes the technology and the industry forward, right? So we have to acknowledge that this data is also used for a very, very good reason. So we have to kind of - not find a balance, because that's not the right term - there's no compromise when it comes down to privacy - but the proper setup of this. I think that resonates with me a lot, the way he put it in the forum. There are human beings, and there is also our data-driven innovation. They all come together to achieve the business outcomes.

Amanda Fennell: It's valid. It's valid. So we do typically have, you know, a wrap-up of some of our main points that we hit on. So I'll go over a couple in case anybody has taken notes and is going to deal with the test we're going to give you later. But one of them that I think hits home is that consumer expectations have changed, and we need to try to keep pace with that. Today's consumers have grown up in a very different age of data privacy. You can't buy an engagement ring online without getting bombarded by ads - probably an issue there. Something about how our ecosystem views data privacy could be a problem. But you need to treat users as humans, which is an amazing idea and definitely the main one to take away here. And then this morally humble approach to the program that you're building, I think, is a really cool one. We should start getting T-shirts made - would you do this with your mother's data? This is a great question. People are the queen on the chessboard. They're the most agile. They can pivot the most.

Amanda Fennell: And then we also typically will end with, like, a closing quote. And so I found two that I think were good because I was like, OK, I'm American. Others are not. So I'll say Marlon Brando had a great quote - somewhat of an actor; some people may have heard of him: "Privacy is not something that I'm merely entitled to. It's an absolute prerequisite." It sounds awesome in theory, and I think that's how human beings feel. But somewhere we're losing this, and I don't think it's translating to the technology. I think that's one place where we need to start making that translation better, because we know, we feel, that it's an inherent right. That's not translating into the way that we're doing these user experiences, and it's not translating into how we're collecting this data and what we do with it.

Amanda Fennell: And then the second one is actually from Edmund Burke, British philosopher and statesman: "Better to be despised for too anxious apprehensions than ruined by too confident a security." Takes a minute. That sounds awesome. I like that. I'd rather be seen as somebody who's apprehensive and has a lot of, you know, concerns about everything than be overly confident. So I hope that that's super helpful.

Amanda Fennell: Thanks for digging into these topics with us today. We hope you got some valuable insights from the episode. Please share your comments. Give us a rating. We'd love to hear from you. 


Unidentified Person: "Security Sandbox" is produced by Relativity. Our theme music was created by Monarch. Find us wherever you listen to your podcast or visit for more episodes.