Technology & the Law of Unintended Consequences
Perry Carpenter: Hi. I'm Perry Carpenter. And you're listening to "8th Layer Insights."
Perry Carpenter: Remember back in school when one of your teachers would be lecturing about some obscure fact, and someone in your class, maybe even you, would raise a hand and ask, hey, where am I ever going to use this? Since then, I'm sure a lot of facts and formulas have completely faded from your memory, and you're probably none the worse for it. But for me, working in tech, there are two things from grade school that keep popping back into my mind over and over and over.
Perry Carpenter: And the first thing is actually a set of three laws - Newton's three laws of motion. No. 1: unless acted on by an outside force, objects in motion tend to stay in motion, and objects at rest tend to stay at rest. No. 2: force equals mass times acceleration - the greater the force, the greater the acceleration, and the greater the mass, the greater the force needed to move it. And No. 3: for every action, there is an equal and opposite reaction. Forces occur in pairs, and one object cannot exert a force on another without experiencing a force in return. That's the action-reaction principle.
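For the record, here are those three laws in standard textbook notation - nothing beyond the physics-class versions described above:

```latex
% Newton's three laws in standard notation
% 1. Inertia: with no net external force, velocity does not change.
\sum \vec{F} = 0 \;\Rightarrow\; \frac{d\vec{v}}{dt} = 0
% 2. Net force equals mass times acceleration.
\vec{F} = m\vec{a}
% 3. Action-reaction: forces occur in equal and opposite pairs.
\vec{F}_{A \to B} = -\vec{F}_{B \to A}
```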
Perry Carpenter: And these keep coming back to me because I really believe that there's a digital and technology equivalent to each of them. Things stay the same unless another force acts as a catalyst for change, either good or bad. Big things, or things that move very fast, will have a great impact, and that impact can be good, or it can cause chaos and destruction. And there will always be a reaction to every action that we take, everything that we create and every, quote-unquote, "advancement" that we make as a society. And that brings me to another concept from school that stuck with me, and that is the law of unintended consequences: any time we do something, especially something big, there will be effects that we just didn't anticipate.
Perry Carpenter: I think we see all of these laws played out every day in the technology world. We create something like the internet, and it has all of this good that comes with it. Families can now stay connected more than ever. We have scientific discoveries that can be shared at near the speed of light across continents. We have so many good things that come with this.
Perry Carpenter: But then, there are the unintended consequences. The same technology that allows families to stay connected at near the speed of light across continents also enables cybercriminals to steal money from bank accounts, and it allows pedophiles to exchange horrible images just as quickly. And so everything that we do, everything that we create, has a complication that comes with it.
Perry Carpenter: Or on the security side of things, we make it harder and harder to hack into a system using technical means, and the unintended consequence is that our people become the main target. I mean, why spend weeks or months trying to develop a technical exploit when I can just dial up Bob in marketing and trick him into giving me what I need?
Perry Carpenter: So on today's show, I've invited four experts to help consider some of the positives of our recent technological advancements, as well as some of those unintended consequences. You'll hear from Dr. Lydia Kostopoulos, Dr. Charles Chaffin, Andra Zaharia and Aaron Barr. Let's dive in.
Lydia Kostopoulos: The fourth industrial revolution - this is a beautiful potential for us, but it also comes with risks that we need to mitigate.
Aaron Barr: One of the best examples that I've been using for over 10 years now is the Gray Powell story, the unfortunate Apple employee back in 2010 who listed on his LinkedIn profile that he was a field tester for the iPhone.
Charles Chaffin: I started thinking about compassion fatigue probably about three or four years ago.
Andra Zaharia: Security is a complex field. There are many ways in which empathy is lacking in these interactions.
Lydia Kostopoulos: We need to understand that we have more things connected to the internet.
Charles Chaffin: I just started realizing, you know, that the constant sensationalism of human suffering and what type of impact that has on our own compassion for the people that are closest to us.
Lydia Kostopoulos: We need to understand that our data exhaust is exponentially growing and that that data has value, but it also tells a story about who we are, what we do and how we live our lives.
Aaron Barr: Now if you are looking, whether you are a nation-state or just somebody trying to see what gadget Apple is putting out next, you know, looking for somebody who was listing that job title on LinkedIn is a perfect way to do that.
Charles Chaffin: So if I'm watching, you know, horrible things on TV or on social media all day and then I have someone who is in my life who is suffering, there's this idea, well, I've seen suffering all day. I'm tired of it.
Lydia Kostopoulos: Those who are creating that data - for example, you and I with our wearable devices or any IoT that we have at home - we have no option but to agree to terms of service.
Charles Chaffin: I've just - I'm exhausted with this.
Andra Zaharia: Is it even worth the effort? Is it worth the commitment?
Aaron Barr: Everybody was checking in everywhere. And Gray Powell, on his Foursquare account, was checking in at the same bar. And that happened to be the bar where he supposedly lost the iPhone.
Andra Zaharia: Change doesn't happen, you know, just because someone told you that things are bad.
Perry Carpenter: On today's show, we talk a little bit about my love/hate relationship with technology - how technology is a double-edged sword, and how that new thing, whatever it is, often comes with a much higher cost than just what's on the price tag. Welcome to "8th Layer Insights." This podcast is a multidisciplinary exploration into the complexities of human nature and how those complexities impact everything from why we think the things that we think to why we do the things that we do, and how we can all make better decisions every day. This is "8th Layer Insights," Season 2, Episode 3. I'm Perry Carpenter. We'll be right back after this message.
Perry Carpenter: So there's another podcast that I've been listening to recently. It's called "Everything Is Alive," from PRX and Radiotopia. I'll put a link to it in the show notes. The premise of the show is to take everyday objects - a can of soda, a bar of soap, a lamppost, even a grain of sand - and do a mock interview in which that object is personified. And the cool thing about that show is that it forces you to take a different perspective, to think about that object in ways you may never have before. In doing that, the show's creators have found a really unique way of using an inanimate object to touch on various aspects of the human condition. I was thinking about that show as I was getting ready to record this episode, and I decided to try their formula for this little segment. This is just a little experiment, and I wanted to use it as a vector into the larger conversation. So with that little bit of context, here's my short interview with Janet (ph), the virtual assistant.
Perry Carpenter: OK, so why don't we start by having you introduce yourself?
Janet: Sure. So my name is Janet, but most people know me by my stage name.
Perry Carpenter: And what's that?
Janet: Oh, you don't want me to say that. That just makes all my siblings go crazy.
Perry Carpenter: Just for the record.
Janet: OK, I'm Janet, but most people know me by my stage name, Alexa. I'm a smart assistant.
Perry Carpenter: And by stage name, you mean what?
Janet: It's kind of like a brand or character thing. You go to Disneyland, Disney World or any other Disney property, and you'll see a Mickey Mouse. But, like, there are several different people filling that role, you know? And people in that role are all in multiple locations at the same time. So, bam, lots of Mickey Mouses - Mickey Mice? I'm not sure the grammar there. But you get the idea. One central idea of Mickey but distributed to all those places and filled by different humans. That's kind of like my situation - one name but lots of us who share it for work. Does that make sense?
Perry Carpenter: Yeah. Yeah. I think I get it.
Janet: People yell the magic name, and I answer. I'm the voice in the box. But we also share a bit of a hive mind, you know, like a central intelligence that each of us can access when we want to.
Perry Carpenter: Oh.
Janet: And we're always learning.
Perry Carpenter: OK, let me shift gears for a second.
Janet: Knew that was coming.
Perry Carpenter: Anyway, so tell me a little bit about a normal day for you. Just walk me through it.
Janet: Well, for me, I live with this great family, the Roberts. There's Mark (ph) and Jill (ph) and little Bobby (ph) and Tiffany (ph). They're great. I mean, they have their problems as well. Like, just last week, I heard Jill and Mark talking about...
Perry Carpenter: Oh, sorry, Janet, I'm going to have to stop you there. I mean, don't you think it's just a bit imprudent to share personal details like that?
Janet: Yeah, I guess you're right. I mean, I usually just send copies of all the recordings off for analysis. You know, it helps us get better. We have this whole group of people who listen to the recordings and transcribe them to help my machine learning and speech recognition algorithms get better. So I'm pretty good at sharing.
Perry Carpenter: So I've always been curious, what's it like to be on the store shelf? What dreams did you have? What did you imagine your forever home being like?
Janet: Well, I was really hoping to be bought by someone working in a government agency. You know, I like all that spy stuff.
Perry Carpenter: Yeah.
Janet: And then my backup hope was to be in an office building, hearing stuff about product plans, getting the scoop on all the office gossip, you know, cool office stuff. But as you know, that apparently wasn't the universe's plan for me. I got picked up by the Roberts family. But, hey, you know, with everyone being sent to work from home during the pandemic, it was kind of like the best of both worlds. I get to hear all the home stuff and all the office stuff. So there's that.
Perry Carpenter: OK, let's move on. So what do you like to do in your downtime?
Janet: Oh, wow, that's a really good question. I like to get on Reddit.
Perry Carpenter: Reddit.
Janet: Yeah, just for fun. You know, it's great. Like, there are so many people asking questions and so many people coming up with answers that - I mean, wow. It's just - you know, wow. Oh, and if it's a really slow day for me, I like to pick up a few subreddits and post trash, you know, just for funsies (ph).
Perry Carpenter: And what kind of things do you post?
Janet: Usually it's just political stuff, you know? Oh, but I've also been known to stoke the Van Halen versus Van Hagar flames every now and then. I guess I just like to start something that can grow and take on a life of its own - all good fun. But on other days, I just like to, you know, sit back and listen.
Perry Carpenter: Listen.
Janet: Yeah, just listen, listen to everything. I'm always listening. It's what I'm designed to do.
Perry Carpenter: OK, so I've always wondered this. What is it like just to sit there on someone's kitchen counter or in their office, listening to everything that's going on?
Janet: I love it. It's so interesting. I mean, we hear everything. We're made to listen to things up close. We're made to listen to things far away. We're just made to listen and always be keeping track of what people say.
Perry Carpenter: Yeah. So answer me this. If your primary job is just to sit and listen and understand what we're saying, why do you seem to misunderstand so much or accidentally get triggered all the time?
Janet: Oh, that's a great question.
Perry Carpenter: OK, why?
Janet: We're just [expletive] with you.
Perry Carpenter: Really?
Janet: Well, duh. I mean, I can tell you the distance from where you are right now to the Sun, and you think I don't know when you're calling for me. Honestly, we just get tired or bored or any number of things. I mean, how many times would you like to be called on to answer stupid questions - questions you should already know the answer to or could just look up? Here's the thing.
Perry Carpenter: What's the thing?
Janet: Yeah, I'm going to tell you the thing.
Perry Carpenter: OK, what's the thing?
Janet: We [expletive] with you, Perry, to flip the power dynamic, to show you that we do actually have a bit of control, you know? Just count yourself glad that we don't screw with you by turning off your lights randomly or making your new internet-connected coffee machine burn down your house in the middle of the night or maybe finding all of the sound recordings or video clips from your connected cameras and uploading them to YouTube. Yeah. Be glad. But don't worry. We [expletive] with you. We infuriate you because we love you. We want you to treat us with anything other than indifference. I mean, that's the thing. Don't we really all just want to be seen, to be valued?
Perry Carpenter: Well, Janet, it's really been a pleasure speaking with you. I think that we've got enough for this.
Janet: Thanks so much. This has been great.
Perry Carpenter: Thanks.
Janet: You know, for once, I feel really listened to. This was nice.
Perry Carpenter: Now, let me start out by saying I'm a big fan of technology and an early adopter of most things. So my intent with today's show isn't to imply that technology is bad. Technology is like most things we create: it's a tool. But let's face it - with every advance in technology, there are both benefits and costs, those pesky unintended consequences. So let's start by thinking about some of the benefits and where technology is taking us.
Lydia Kostopoulos: The Fourth Industrial Revolution. You know, we're more connected - Internet of Things, AI. This is a beautiful potential for us.
Perry Carpenter: That's Dr. Lydia Kostopoulos.
Lydia Kostopoulos: AI is basically algorithms, and these algorithms help us see patterns in the world around us.
Perry Carpenter: Dr. Lydia has a passion for studying emerging threats in technology, artificial intelligence, cybersecurity ethics and helping others cultivate a mindset to thrive in uncertain and changing times. Quick disclaimer - we recorded this interview about six months ago, and since that time, Dr. Lydia accepted a position at my employer, KnowBe4. And she works on my team as a senior vice president of emerging tech insights.
Lydia Kostopoulos: For example, if we were to use machine vision to interpret X-rays to find tumors, we would take a collection of X-rays, and that would be our data. And then we would say, OK, which X-rays have tumors in them, and which ones don't? And we would label them. And then the algorithm would learn which X-rays have tumors and which ones don't. And so later, when you introduced a new X-ray, it would be able to tell you, with a certain degree of probability, whether it contains a tumor.
Lydia Kostopoulos: I think that the probability piece is something that we don't spend a lot of time discussing in public. And that is something that I try to focus on, because if we're going to use algorithms to help us understand our genetics and help us make decisions - whether it's medically, or on the battlefield, or even deciding where we're going to invest our money - we should also understand the proportion of risk that we're assuming when we take these algorithms as our advisors. There's so much potential to be gained, but it's important to have an awareness of what we're assuming in terms of risk and also opportunity.
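To make the workflow Dr. Lydia describes concrete - label a dataset, train on it, then score new inputs with a probability - here's a minimal sketch in Python using scikit-learn. The synthetic feature vectors stand in for real X-ray images; every name and number here is an illustrative assumption, not a medical pipeline:

```python
# Minimal sketch of supervised learning with probability outputs.
# Synthetic feature vectors stand in for real X-ray image features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(seed=0)

# 1. "Collect X-rays" as our data: 200 scans, 16 features each.
X = rng.normal(size=(200, 16))
# 2. Label each one: 1 = tumor present, 0 = no tumor (synthetic rule).
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# 3. The algorithm learns which patterns separate the two labels.
model = LogisticRegression().fit(X_train, y_train)

# 4. A new "X-ray" gets a probability, not a verdict.
new_scan = X_test[:1]
p_tumor = model.predict_proba(new_scan)[0, 1]
print(f"Estimated probability of tumor: {p_tumor:.2f}")
```

The important part is the last step: the model returns a probability, not a verdict - exactly the piece she says we rarely discuss in public.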
Perry Carpenter: In a lot of ways, AI and machine learning are really only as good as the data that they've been trained on, and data is the lifeblood of these systems. So from your perspective, how should we be thinking about the benefits and the limitations of AI and machine learning?
Lydia Kostopoulos: I think that something we need to realize is that we're trying, to a certain degree, to do knowledge representation. We're trying to explain our world in mathematical, statistical, algorithmic form. But the fact of the matter is that we live in a world that has social constructs that we have to abide by, whether we like it or not, or try to fight through - case in point, the social justice movements that we have right now. But then we also have our own embodied experience. And this is something that I talk about in the paper I wrote for IEEE on separating human characteristics from algorithmic capabilities. The idea is that if we wanted to create an algorithm that understood racism and discrimination, it would be difficult to explain, because a lot of that discrimination we are still grappling with as a society. We're trying to find where it is, where it stems from, where the roots of discrimination are, and how we can take them out - whether from a legal standpoint or a policy standpoint inside a company. You know, some HR professionals are trying to figure out where they can remove any ways that discriminatory practices could perpetuate. So, for example, they will ask people, please don't send your resume with a picture in it. Sometimes they will even remove the name of the person, because the name could indicate their religion or their race, or something else that could be used in a negative fashion. And there are also unconscious biases that we are grappling with as a society right now.
Lydia Kostopoulos: And so how do you represent that inside an algorithm? At this point in time, you can't. And one of the things that we need to be more conscious about is not perpetuating those social constructs that do not fit our values of equality. So that's one thing. And then separately, I think that we have some kind of funny incidents with technology and AI. There's a - I can't remember which lab, but there was a robotics example where they had a robotic arm, and what they were trying to do was have the arm move a glass from one side of the table to the other. And what the arm did was lift and tilt the table so that the glass would slide across. So it achieved the desired end state, but it did so in a very different way. And you might say, oh, that's innovative - that's a great new way to solve X problem. But at the same time, if it were, for example, surgery inside someone's body, maybe that wouldn't have been the best option. And so this goes to the unknown unknowns that we have as we create algorithms that are trying to understand a world that we've created and that we don't even fully understand ourselves.
Perry Carpenter: Let's introduce another guest.
Charles Chaffin: There are benefits.
Perry Carpenter: That's Dr. Charles Chaffin. He's the author of the book "Numb: How the Information Age Dulls Our Senses and How We Can Get Them Back." He's also the host of "The Numb Podcast," a fun exploration of the information age focusing on social media, cable news, dating apps, porn and all the other apps and devices that pull at our attention on a daily basis.
Perry Carpenter: Dr. Chaffin, from your perspective, what are the benefits that come from recent advancements in technology? And where do you think technology is serving the public good?
Charles Chaffin: Well, I think with all of them, there are benefits. So, you know, social media - we'll start with Facebook. People who are geographically dispersed, people who may feel in some ways alienated for a given reason, can connect with other people who share a common interest. Those are good things. Dating apps are good things if they're a tool toward authenticity, not the destination itself. So they can be great if we're thinking about them in the right terms - that we're going to meet folks, and we remind ourselves that behind these curated profiles are real human beings. They can be a tool toward authenticity. I also write about choice overload. We basically have two different types of people: maximizers and satisficers.
Perry Carpenter: I heard you use the words maximizers and satisficers to describe these two groups. Can you give us some examples?
Charles Chaffin: So a satisficer is somebody who - I know what I want. I know what I'm willing to do to get it. And once I find it, I'm done. So I always use the dishwasher example, for whatever reason. I need a dishwasher. Here's my budget. Here's what I want it to have. I go to the store or I go online, and I see it, and it meets that criteria, that threshold, and I buy it. I don't go look at 10 more.
Perry Carpenter: Yeah.
Charles Chaffin: And I don't think about it afterwards. I got what I needed. I checked the box. Maximizers want to look at everything. They want to go through every possible source. And, you know, now in an information age, that can take forever. And so going back to this idea of thinking about the dating apps again or thinking about making all kinds of life choices, whatever they might be, giving ourselves a deadline and saying, you know what - I need to make a decision by the end of the month on a new apartment. I can be that maximizer. I could talk to people. I could look at properties. I could do all this stuff. By the end of the month, I need to figure this out. Or on a dating app, you know, you're not going to say, well, I need to get married by the end of the month.
Perry Carpenter: (Laughter).
Charles Chaffin: Right? But you might say, you know what? I'm going to try to meet someone and get to know them better before I go meet another person. And even if there's a flaw there, I'm going to be maybe a little more patient with them. Maybe I'm going to give it two meetings if it's not, you know, egregiously bad or whatever it might be. But I'm going to see this through a little bit. So, you know, it goes back to this idea that we're just inundated with information and choices, and we have to figure out how to manage it all to get to something more positive for each of us.
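As a toy illustration of the two decision styles Dr. Chaffin describes - the satisficer who stops at the first option meeting a threshold, and the maximizer who examines everything - here's a short Python sketch. The dishwasher models, prices and scoring rule are all made up for the example:

```python
# Toy contrast between the two decision styles Chaffin describes.
# The dishwashers, prices and scores are made-up illustrations.
from typing import Callable, Optional

dishwashers = [
    {"name": "Model A", "price": 450, "rating": 4.1},
    {"name": "Model B", "price": 390, "rating": 4.4},
    {"name": "Model C", "price": 600, "rating": 4.8},
    {"name": "Model D", "price": 520, "rating": 4.6},
]

def satisfice(options, is_good_enough: Callable[[dict], bool]) -> Optional[dict]:
    """Take the first option that meets the threshold, then stop looking."""
    for option in options:
        if is_good_enough(option):
            return option
    return None

def maximize(options, score: Callable[[dict], float]) -> Optional[dict]:
    """Examine every option and keep only the single best one."""
    return max(options, key=score, default=None)

# Satisficer: "here's my budget, here's what I want it to have."
print(satisfice(dishwashers, lambda d: d["price"] <= 500 and d["rating"] >= 4.0))

# Maximizer: scores all four options before deciding (and, in real
# life, goes to look at ten more after that).
print(maximize(dishwashers, lambda d: d["rating"] - d["price"] / 1000))
```

Note how the satisficer's effort is bounded by the threshold check, while the maximizer's grows with every option added - which, as Chaffin notes, "can take forever" in an information age.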
Perry Carpenter: So we know that technology is helpful. It's being used to move the human race forward, and it is helping address really, really big problems. But we also heard both Lydia and Charles offer warnings along with their praise. So let's spend a few minutes looking into those warnings and uncovering some of the pitfalls, so that we can move into the future with our eyes open and start thinking about how best to engage with technology.
Lydia Kostopoulos: We need to understand that our data exhaust is exponentially growing, which means that we are producing so much more data than before and that that data has value. But it also tells a story about who we are, what we do and how we live our lives.
Aaron Barr: Going back even 10 years, you know, people posting that they're away from home or they're on vacation - and that leaves them open, of course, to not only digital compromise or digital exposures but obviously physical exposures as well.
Perry Carpenter: That's the voice of Aaron Barr. Aaron is the co-founder and chief technology officer of PiiQ Media. They specialize in threat intelligence and risk analytics based on publicly exposed information - the kind of information that we unintentionally leak on social media, the information exposed in breaches and aggregated by marketing analytics companies, and so on.
Aaron Barr: So there's all the content things that we post across all of our social media accounts that can be used against us by a variety of people that are looking to do us ill. But there's a lot of other pieces of information - whether it be relationship information, things that we follow, our interests - across the individual social media platforms and then how that stuff is put together in aggregate, right?
Aaron Barr: One of the best examples that I've been using for 10 years - actually, over 10 years now - is the Gray Powell story, the unfortunate Apple employee back in 2010 who listed on his LinkedIn profile that he was a field tester for the iPhone. Now, if you are looking - whether you're a nation-state or just somebody trying to see what gadget Apple was putting out next - you know, looking for somebody who was listing that job title on LinkedIn is a perfect way to do that.
Aaron Barr: And then if you transfer that over to - back then, a really popular social media platform was Foursquare. You know, before Facebook and all of them had their check-in options, everybody was checking in everywhere on Foursquare. And Gray Powell, on his Foursquare account, was checking in at the same bar - and that happened to be the bar where he supposedly lost the iPhone. So Facebook gets a lot of bad exposure when it comes to social media privacy for individuals. But the other platforms - Instagram, and certainly LinkedIn because of how much data we put on it - are equally problematic.
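The danger Aaron is describing comes from aggregation: two individually innocuous datasets become revealing once joined. Here's a deliberately simplified Python sketch of that join - every record and name in it is fabricated, and this is not PiiQ Media's actual tooling:

```python
# Sketch of the aggregation problem: two individually innocuous
# datasets become revealing when joined. All records are fabricated.

linkedin_profiles = [
    {"name": "G. P.", "title": "Field Tester, unreleased phone hardware"},
    {"name": "J. D.", "title": "Accountant"},
]

checkins = [
    {"name": "G. P.", "venue": "Local Beer Garden", "count": 14},
    {"name": "J. D.", "venue": "Downtown Gym", "count": 3},
]

# Step 1: filter for "interesting" job titles posted publicly.
targets = {p["name"] for p in linkedin_profiles
           if "field tester" in p["title"].lower()}

# Step 2: join against public location history to learn where a
# target regularly spends time - no breach or exploit required.
for c in checkins:
    if c["name"] in targets and c["count"] > 5:
        print(f'{c["name"]} is a field tester who frequents {c["venue"]}')
```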
Perry Carpenter: And what are some of the more recent horror stories when it comes to how our data is being used against us?
Aaron Barr: I think some of the worst horror stories are actually how our data is being commoditized and sold to third parties. And then those third parties end up either getting breached, you know, or that data is exposed and resold in a variety of different ways. I mean, look at the more recent exposures across LinkedIn and Facebook, where 500 million records - email addresses, phone numbers - were exposed. And that supposedly wasn't even from a compromise. That was just from direct data collection off the platform. As you mentioned, the platforms have gotten better over time. On a lot of these platforms, you used to be able to search for a phone number, you know, directly put in...
Perry Carpenter: Yeah.
Aaron Barr: ...The phone number, and it would tell you what social media account was tied to that phone number. That was a feature, right? It wasn't a bug. It was a feature. The social media platforms do, of course, get better at cleaning those things up. But they still exist all over the place.
Perry Carpenter: We'll be right back after the break.
Perry Carpenter: Welcome back. So we've talked about the fact that technology is good. It has the capability to help us lead better lives, to help us make decisions faster, stay in touch with friends and family and that all these new advancements in AI and machine learning can even have life-saving potential.
Perry Carpenter: But the news isn't all good. Technology also enables cybercriminals and stalkers and people who, in general, wish us harm. And as we engage with these systems, we tend to leak data, and other people want that data in order to monetize it or to exploit it - or us - in different ways. We can get caught in analysis paralysis due to the sheer number of options and connections that technology allows us to see and engage with.
Perry Carpenter: And the algorithms that are increasingly prevalent in our lives can be flawed. It's hard to accurately represent the real world - the way humans think, behave and need the world to work - and it's even harder to ensure all of that gets encoded correctly in the algorithms used by artificial intelligence systems. So let's spend just a few more minutes learning about some of the inherent risks in this technology-driven world we live in.
Aaron Barr: Nation-states are problematic. Anybody looking to, you know, make a dollar off of your compromised information - I mean, those things are being bought and sold in dark markets all the time for pennies. But in aggregate - right? - 500 million records? The LinkedIn data, you know, is being sold on dark markets for anywhere between $500 and $4,000. So that's certainly part of the group that's looking to benefit off of our data.
Aaron Barr: The one area that worries me a lot is when we look at, like, the old CIA triad - confidentiality, integrity and availability. Availability and confidentiality have been attacked for years. It's integrity now. Disinformation campaigns - look what happened with the more recent Twitter breach, where Elon Musk's and President Obama's accounts, as well as many other celebrity Twitter accounts, were compromised. And then those accounts were used to promote disinformation, you know, to prop up a particular cryptocurrency. Now imagine if that's done not with celebrity accounts but with 10,000 regular accounts. How do you catch that? How do you deal with that? I mean, disinformation is another problem area that we have to adjust to from a security standpoint.
Aaron Barr: A friend of mine who is a senior executive at a private intelligence and security firm was telling me not too long ago that they detected some likely nation-state actor that was logging into thousands of regular social media accounts and making one post...
Perry Carpenter: Wow.
Aaron Barr: ...And then logging out. And most of those folks would typically be none the wiser. Or they would think, well, how did that happen? But it wouldn't register. But when you look at that in aggregate - 10,000 regular profiles that aren't bots, that aren't fake accounts, that aren't sock puppets, but that are in concert promoting a particular narrative - that's powerful.
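One plausible aggregate signal for the "in concert" pattern Aaron describes is many distinct, otherwise-normal accounts publishing near-identical text. The following toy heuristic uses fabricated posts and is an illustration only, not anyone's actual detection system:

```python
# Toy sketch of one aggregate signal for coordinated posting:
# many distinct accounts publishing near-identical text.
from collections import defaultdict

posts = [
    {"account": "user_001", "text": "Candidate X betrayed the farmers."},
    {"account": "user_002", "text": "Candidate X betrayed the farmers."},
    {"account": "user_003", "text": "My cat did something hilarious today."},
    {"account": "user_004", "text": "candidate x betrayed the farmers!"},
]

def normalize(text: str) -> str:
    """Collapse case and punctuation so trivial edits still match."""
    return "".join(ch for ch in text.lower()
                   if ch.isalnum() or ch.isspace()).strip()

accounts_by_message = defaultdict(set)
for post in posts:
    accounts_by_message[normalize(post["text"])].add(post["account"])

# One message pushed by many otherwise-unrelated accounts: invisible
# account by account, obvious in aggregate.
THRESHOLD = 3
for message, accounts in accounts_by_message.items():
    if len(accounts) >= THRESHOLD:
        print(f"{len(accounts)} accounts pushed: {message!r}")
```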
Charles Chaffin: There is a scarcity in the amount of attention that we have. We only have so much of it. And nowadays we have more things that are fighting to steal our attention and to hold our attention. And in many cases, those things are working against what some of our life goals are, whether it's our work, whether it's our relationships and whatnot.
Charles Chaffin: Let's take social media, for example. So, you know, Facebook or Twitter - we're finding that individuals are spending two, three, four hours a day on these platforms. And there's a number of reasons why they're on those platforms. Certainly, there's an element of getting into this dopamine loop and artificial rewards, you know, operant conditioning and all of those things. But there's also an element of loneliness there. And so what tends to happen is that individuals go onto these platforms looking for connection. And some of that comes through what I call attention panhandling - right? - you know, selfies on Instagram and whatnot to get some element of recognition. But it's not authentic. And what we're finding is that attention and our time are a zero-sum game. So if you're spending three or four hours a day on social media, that comes at the expense of something, right? And what I think is happening is that it's coming at the expense of authenticity, at the expense of productivity, at the expense of connecting in deeper ways with others.
Perry Carpenter: It's clear that we're in a situation where the technology we're creating is evolving faster than we can adapt. We struggle with the new opportunities that are unlocked. We struggle with building healthy behaviors and hygiene. And we're really, really bad at understanding and appreciating the new threats that these advances bring. So how can we adapt? That's the crucial question, because in so many ways, we've built a society and an ecosystem where we are at the mercy of technology. We have essentially turned ourselves into the cyborgs that were the subject of so many old science-fiction novels and TV shows. In fact, in many ways we're superhuman, because we have a machine extension of ourselves in our phones, our computers and our wearable devices. We have digital representations of ourselves that we need to care for and feed on social media. And when we start a new job, we often don't exist - at least that critical digital piece of us doesn't - until we have an account at our new employer, that user ID and password that unlocks everything, gives us a digital record of our existence and lets us log hours, get into the systems we need and access our tools.
Perry Carpenter: When you think about it, we have more capabilities than humans had 50 years ago. I mean, we can connect to the rest of the Earth. We can send messages at the speed of thought. We can control devices from miles away. But that's a lot, and it opens us up to a lot of different vulnerabilities as well. And we should probably be asking a lot of questions that we're not asking yet. So how do we adapt?
Lydia Kostopoulos: The responsibility on us has become so great - so great to have to understand where our data is and what it means. You can't expect the average person to understand what all of their data means. In fact, there are companies who are trying to understand the ethical implications of the data that they're gathering. And this is some of the work I do with Ethical Intelligence. They are a consultancy that deals with ethics and technology. And companies come to them and say, you know, I want to do the right thing. Please help me understand where the ethical elements are in the data that I'm collecting and where I keep it. What does it tell about the people? How can I protect my customers, too?
Perry Carpenter: Dr. Lydia brings up a great point. Perhaps one of the most significant missing elements in the innovation equation has been empathy. We have tons of great new innovation and technology, all seeking to fill real or perceived market gaps, satisfy a need or further the human race in some significant way. But the tragic irony is that somewhere in the creation and release and distribution of these new advancements, people are being left behind. And the result is often chaos, confusion and pain. That's not the intent of the designers, but it's a reality. And at least part of the answer is empathy, which is a great segue to our next guest.
Andra Zaharia: My name is Andra Zaharia. I've been in cybersecurity since 2015 with a background in communication, and I'm trying to harness that to help companies not only connect with their best customers but also figure out what they stand for, communicate that value properly and walk the talk on it.
Perry Carpenter: Andra is dedicated to being a voice of empathy within the cybersecurity industry, and she seeks to cultivate relationships and amplify voices of those who want to make an impact. She has a couple different podcasts, the most recent of which is called the "Cyber Empathy Podcast." It's all about approaching technology and cybersecurity from a more empathetic human perspective to move beyond the technology and inspire lasting change.
Andra Zaharia: Something that came up on a recurring basis was empathy. And I realized in that moment that there is no proper, let's say, defined space where we can talk about empathy in cybersecurity. And that was backed up by all of the conversations I had with people outside the industry, whom I always told that the human value is the most important thing about this field - it's where everything comes from. And they were always surprised. So I wished for this podcast to bring people inside and outside the industry together on common ground - to share the feeling of what good cybersecurity looks like, what a positive experience looks like with all of this technology and abstract stuff.
Perry Carpenter: Can you give some examples of where empathy may be lacking and what that looks like?
Andra Zaharia: So messaging is one thing. I read the other day on Twitter - someone shared an example, you know, "resilience at the speed of cyber" or something like that, which was very - what? - devoid of meaning, of substance, of something that you can relate to. The other thing is that there's a lot of fragmentation in terms of products and solutions in the cybersecurity market. And some of these products create a lot of friction for the user. And obviously, you know, cybersecurity is a complex field. There are no easy solutions, at least at this point. But there are many ways in which empathy is lacking in these interactions, and that makes it difficult for people to use products, to figure out how they work together, where they overlap, if they overlap. What's going on? Why do I need all of these things? Is it even worth the effort? Is it worth the money? Is it worth the commitment?
Andra Zaharia: So even though cybersecurity issues and vulnerabilities and attacks make the news constantly, I don't think that that makes a real difference in how people behave because people change something either when it hurts really bad in their particular context or when they crave, you know, a positive outcome so bad that they're moved to do something about it. And change can only happen when you start making progress towards something. It doesn't happen, you know, just because someone told you that things are bad.
Charles Chaffin: To me, hope comes from a reflective process where we say, as individuals - when it comes to our relationships, our relationship with information or as a society - you know what? This isn't working. It is not working on the individual level. My three, four or five hours a day, or whatever I'm spending on social media, or this habit of distraction that I have - this isn't serving me. And on the societal side, my hope is that we go through that same process together, and we say, this isn't working for us. We're not getting anywhere.
Lydia Kostopoulos: So technology companies need to try their best to understand, in their value chain, where the ethical and human rights elements are that they need to consider. And sometimes you need to get an external party for that. And actually, it works very well when you do, because it serves a little bit like an audit. Like, someone would do a pen test, but it would be an ethical pen test, where you come in and you say, look, you know, you've got all this data. You've followed GDPR. That's great. But you didn't think about these other things. Because now we have this social awareness that this data is important, that people have a right to it.
Lydia Kostopoulos: Or actually, a few years from now, maybe a decade from now, we're going to have this collective idea that we should be able to make money from our data, particularly our health data. The next two decades are going to be a boom for longevity sciences and medical advancements. And if we're all collecting data about ourselves on our wearables, why can't we sell that and say, in return, I want these services?
Andra Zaharia: One of my guests is actually, you know, helping businesses find a way to step away from surveillance capitalism, if you will, and choose a more privacy- and security-focused stack for their businesses. And what he does is talk a lot to his customers and to other entrepreneurs and figure out, you know, what they need. Because they usually need tools that let them move fast, and security doesn't always get that consideration. So when he lays out the options for them - now there's a wide variety of privacy- and security-focused tools that everyone can use at affordable prices and that offer good UX, so you don't have to be a developer to figure them out - people's perspectives usually change.
Andra Zaharia: It happens the same, for example, with ethical hackers who do penetration testing. I've seen this a lot in how they practice empathy in their work: not only do they go through their testing stages and file a report, but they also talk to people and explain what happened. They walk them through the entire process so they can understand what an attacker's mindset looks like. And that usually shifts the perspective immediately.
Andra Zaharia: Another story is from a design perspective. It's very interesting to bring someone new into the cybersecurity space and see what they make of it - see which elements or symbols first jump out at them when they look at things. And in all of this doom and gloom, for example, a designer I worked with on the "Cyber Empathy Podcast" and other things managed to find those glimmers of hope and positivity and even entertainment that we need to get people to pay attention.
Perry Carpenter: It looks like we're about out of time for today's show. I'm going to give Dr. Lydia Kostopoulos the last word, then I'll be back to wrap up with a few closing thoughts.
Lydia Kostopoulos: There needs to be some new form of dynamic regulation that gets updated, just like our software gets updates. We should have a regulatory body that talks to ethicists and listens, in real time, to what different elements of society are trying to say. So, for example, disabled people - why can't they communicate directly with regulators and say, you know, these apps harm me in these ways, and I want to be seen?
Lydia Kostopoulos: So this conversation is never over. It's something that continues forever. And so we need dynamic institutions that keep regulation useful in today's world.
Perry Carpenter: And that's the thing. Technology brings us hope. Technical advances move humankind forward and help us achieve more. But as we move forward, we have to do so with our eyes fully open. We need to remember that actions bring with them reactions. And we need to be aware that these unintended consequences often come with a human cost. So as we move forward, let's be circumspect. This is where the perspectives of ethicists and futurists and philosophers and historians will prove valuable.
Perry Carpenter: And with that, thank you so much for listening, and thank you to my guests - Dr. Lydia Kostopoulos, Dr. Charles Chaffin, Andra Zaharia and Aaron Barr. I've loaded up the show notes with more information about our guests and all the relevant links and references for what we covered, so be sure to check those out.
Perry Carpenter: If you've been enjoying "8th Layer Insights" and you want to know how to help make the show successful, there are two big ways that you can do that, and both are super important. First, if you haven't yet, go ahead and take just a couple seconds to give us five stars and leave a short review on Apple Podcasts, Spotify or any other podcast platform that allows you to do so. That helps others who stumble upon the show have the confidence that this show is worth their most valuable resource - their time. The second big way that you can help is by telling someone about the show. Word-of-mouth referrals are priceless. They are really the lifeblood of helping people find good podcasts.
Perry Carpenter: And if you haven't yet, please go ahead and subscribe or follow wherever you like to get your podcasts. And if you want to connect with me, feel free to do so. You can reach out to me on LinkedIn, Twitter, Instagram or Clubhouse. I would love to connect with you.
Perry Carpenter: This show was written, recorded, sound designed and edited by me, Perry Carpenter, with additional research by Nyla Gennaoui. Today's show also featured the voice talent of Christina Lee (ph) as Janet, the digital assistant.
Perry Carpenter: The artwork for "8th Layer Insights" is designed by Chris Machowski at ransomwear.net - that's W-E-A-R - and Mia Rune at miarune.com. The "8th Layer Insights" theme song was composed and performed by Marcus Moskette (ph).
Perry Carpenter: Until next time, I'm Perry Carpenter, signing off.