Hacking Humans 5.13.21
Ep 147 | 5.13.21

How to best fight fake news.

Transcript

Helen Lee Bouygues: Social media platforms make money when eyeballs stay on their platforms.

Dave Bittner: Hello, everyone, and welcome to the CyberWire's "Hacking Humans" podcast, where each week we look behind the social engineering scams, the phishing schemes and the criminal exploits that are making headlines and taking a heavy toll on organizations around the world. I'm Dave Bittner from the CyberWire, and joining me is Joe Carrigan from the Johns Hopkins University Information Security Institute. Hello, Joe. 

Joe Carrigan: Hi, Dave. 

Dave Bittner: We've got some good stories to share this week. And later in the show, Helen Lee Bouygues of the Reboot Foundation - we're going to be discussing social media's effect within the misinformation ecosystem and how users can best fight fake news. 

Dave Bittner: All right, Joe, before we dig into our stories this week, we got a kind note from a listener whose name is Jonathan (ph). He wrote in with an interesting question. He writes - he says, I've been experimenting with two-factor authentication devices and standards. While I've used time-based one-time passwords for years, I'm looking into using a universal second-factor YubiKey for improved security. 

Joe Carrigan: Good idea. 

Dave Bittner: So far, so good. Right? 

Joe Carrigan: Yep. That sounds great. 

Dave Bittner: He says, last week I saw this story from Gizmodo encouraging everyone to use device-based two-factor authentication rather than OTP codes - that's one-time password codes. OTP codes you type in are still vulnerable to phishing on a carefully crafted site... 

Joe Carrigan: Yes, they are. 

Dave Bittner: ...While with U2F, the site you're authenticating with establishes communication with the two-factor authentication device directly, which can't be spoofed the way typing in a six-digit code can. 

Joe Carrigan: Yes. 

Dave Bittner: However, many sites require a mobile phone number or, more likely, an OTP code when you first enable two-factor authentication. Even when you can add a universal second factor, often it's in addition to SMS or OTP. How does that make U2F more secure than OTP? Aren't you just adding another means of authentication? This seems less secure because you have more ways to authenticate. If I have SMS-based authentication and add OTP, that isn't really more secure than SMS unless SMS is disabled. What do you think? Thanks for your great show. 

Dave Bittner: Well, what do you think, Joe? 

Joe Carrigan: I agree. Jonathan makes a great point here. I think I said this a couple of episodes ago, that the security of your account is only as strong as the weakest link in the security chain, if you will. So if I can say I've lost my universal second-factor device, my U2F key, my YubiKey, and the website goes, well, that's OK, we'll just send you a text message - if that's their workflow, it becomes a lot easier for me to get around the universal second-factor device, a protocol in which, so far, we haven't found any vulnerabilities. But if I can get around that - let's say I, the attacker, have SIM-swapped the target's phone and get the site to send me a one-time password via SMS - then I'm in. That's it. 
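
To make the phishing risk Jonathan and Joe describe concrete, here is a minimal sketch of how a time-based one-time password is computed under RFC 6238, assuming the common SHA-1, 30-second, six-digit parameters and a made-up secret. The code is just a number derived from a shared secret and the clock, so anyone who tricks you into typing it on a fake site can replay it against the real one within the time window; a U2F key, by contrast, signs a challenge bound to the site's origin, so there is nothing to retype.

```python
# Minimal RFC 6238 TOTP sketch (assumptions: SHA-1, 30-second step, 6 digits).
# The shared secret below is a made-up example value.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step               # same counter on client and server
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Anyone who captures this typed code within the 30-second window can replay it.
print(totp("JBSWY3DPEHPK3PXP"))
```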

Dave Bittner: Right, right. 

Joe Carrigan: This is a good observation from Jonathan. 

Dave Bittner: It's making your potential attack surface larger... 

Joe Carrigan: Yep. 

Dave Bittner: ...As he notes. And also - I guess it is a case of you're only as good as your weakest link. 

Joe Carrigan: That's right. 

Dave Bittner: Yeah. So if you can, disable those. If you get yourself a hardware key, make sure that you've gone through and disabled the other ones. 

Joe Carrigan: Right. Usually, the way they protect your account in the event that you lose your hardware token is they'll give you a list of one-time passwords that you can print out and keep in a safe somewhere. And that's a fairly secure way of doing things, assuming you can secure that piece of paper. 
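
As a rough illustration of the backup codes Joe describes, here is a small sketch of generating a printable list of one-time recovery codes. The format, length, and alphabet are hypothetical, not any particular site's scheme.

```python
# Hypothetical recovery-code generator; real services choose their own format.
import secrets

def recovery_codes(n: int = 10, length: int = 10) -> list[str]:
    alphabet = "abcdefghjkmnpqrstuvwxyz23456789"   # avoids easily confused characters
    return ["".join(secrets.choice(alphabet) for _ in range(length)) for _ in range(n)]

for code in recovery_codes():
    print(code)
# A server would typically store only salted hashes of these and mark each code
# as consumed the first time it is redeemed.
```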

Dave Bittner: Yeah. Just tack it up to the front door of your house. 

Joe Carrigan: Right. 

Dave Bittner: Yeah (laughter). 

Joe Carrigan: Don't do that. 

Dave Bittner: (Laughter) All right. Well, thanks to Jonathan for sending in that note. It's a good question. Of course, we would love to hear from you. You can send us your questions or your Catch of the Days to hackinghumans@thecyberwire.com. 

Dave Bittner: All right, Joe, let's jump in with some stories here. Why don't you kick things off for us? 

Joe Carrigan: Dave, I'm going way back to 2015 here. That is when a woman in Scotland named Patricia Reilly worked at a company called Peebles Media Group. She worked there as a debt collector, presumably in their accounts receivable department. 

Joe Carrigan: One day, her managing director, Yvonne Bremner, went on vacation. And on the day that Ms. Bremner went on vacation, Ms. Reilly received emails from someone posing as Ms. Bremner. Right? Now, Ms. Bremner is the managing director of this organization, Peebles Media. The email asked for a transfer of 24,800 pounds to be made to another company. And Ms. Reilly worked with her direct supervisor, her line manager, who made the online payment in the requested amount. 

Joe Carrigan: Three days later, she receives another email that seems to be from Ms. Bremner, which asks for another transfer, this time of 75,200 pounds. Ms. Bremner is still out of the office, and by now so is Ms. Reilly's line manager. So in the end, Ms. Reilly goes ahead and makes these payments, and in all she winds up sending 193,250 pounds to these scammers. 

Dave Bittner: Wow. 

Joe Carrigan: So what happens next? Ms. Reilly is fired and then Peebles Media Group sues her, right? And they want to get back 107,000 pounds from the employee and they say that she's liable. 

Joe Carrigan: And there's a write-up of this case from the BBC in 2019. And in this write-up, Ms. Bremner, frankly, has some nasty things to say about Ms. Reilly. She says that Reilly is the office gossip and that she underperforms in her role as a collector and that Ms. Reilly was not authorized to make payments on behalf of the company. But she did, in fact, have access to the banking system. 

Dave Bittner: Well, whose fault is that? 

Joe Carrigan: Yeah, that's an excellent question, Dave. 

Dave Bittner: (Laughter) OK, go on. 

Joe Carrigan: That was one of the things I'm saying as I'm reading this. I'm going, wait a minute. If she shouldn't have been able to make payments, if she was not authorized, why does she have access to the banking system? 

Dave Bittner: Right. All right. Go on. Go on. 

Joe Carrigan: Peebles Media lost that case. 

Dave Bittner: Oh, OK. 

Joe Carrigan: And last week, they lost their appeal of that first decision. Now we're in 2021. We're six years beyond this event. And the case is, I think, now resolved. I don't know if they can appeal again. And I'm not familiar with the U.K.'s legal system. 

Joe Carrigan: The judge in the appeals case had some important things to say. And one of the things he said is it's rare for an employer to sue a junior employee for negligence. And that's what Ms. Reilly was, a junior employee. They should have had some insurance for this kind of event. He said if this becomes a regular event, it, quote, "will have a significant impact on the employment relationship." So he doesn't want this to become precedent is what he's saying. 

Dave Bittner: Oh, OK. 

Joe Carrigan: Right. 

Dave Bittner: I see. 

Joe Carrigan: Because I don't want every employer turning around and suing every employee for losses. 

Dave Bittner: Right. 

Joe Carrigan: Right. That would be a problem. The judge said that Ms. Reilly genuinely thought she was being instructed to make payments by her managing director and that she was neither trained nor experienced in the payment of creditors. So the workflow, even though it might seem off to us, would not have seemed off to Ms. Reilly, because she was not trained for this and hadn't done it in the past. 

Dave Bittner: I see. 

Joe Carrigan: And she did not suspect what was truly going on, and she could not reasonably have been expected to do so. She didn't know she was being scammed, and she had no reasonable way of knowing. 

Joe Carrigan: Now, this whole case is interesting because it sets a - I don't know how legal precedent works in the U.K., but it's interesting that this company sued this woman for something that happened six years ago. The case had been going on for a long time, but the judge has said, no, no, this is not happening. We're not doing this. So I think this is a lesson to companies out there. 

Dave Bittner: Yeah. I wonder, you know, what if she had received training - right? - the type of training we talk about here all the time, you know, to alert your employees? Would the company have had a better case had she been trained but even despite her training then went ahead and sent the money? I don't know. 

Joe Carrigan: I don't know. I think the key problem with this case is that she should not have been authorized to access the banking system, but she was able to do so. 

Dave Bittner: Right. 

Joe Carrigan: That sounds like an authorization problem. That's the second A in the AAA, the authentication, authorization and auditing. It's a fundamental part of security. It sounds like a failure of the authorization part of their process here. 

Dave Bittner: Yeah. And most companies that I'm aware of, if you have to write a check for more than a certain amount, a lot of times that requires two signatures. 

Joe Carrigan: It does, yeah. 

Dave Bittner: Just for these reasons. You get a second set of eyes on it. And as we always say, slow down. Slow down. 

Joe Carrigan: Slow down. That's right. 
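
As a toy illustration of the authorization and dual-control ideas in this story, here is a sketch of a payment check that rejects requests from roles that aren't authorized to pay creditors and requires a second approver above a threshold. The roles, threshold, and amounts are hypothetical, not Peebles Media's actual process.

```python
# Toy authorization / dual-approval check (roles and threshold are made up).
from dataclasses import dataclass

APPROVAL_THRESHOLD = 10_000  # assumed policy limit, in pounds

AUTHORIZED_PAYERS = {"line_manager", "finance_director"}

@dataclass
class Payment:
    requester: str
    amount: int
    approver: str | None = None

def authorize(payment: Payment) -> bool:
    if payment.requester not in AUTHORIZED_PAYERS:
        return False                                  # authorization: role check
    if payment.amount > APPROVAL_THRESHOLD:           # dual control: second set of eyes
        return payment.approver in AUTHORIZED_PAYERS and payment.approver != payment.requester
    return True

print(authorize(Payment("debt_collector", 24_800)))                    # False: not an authorized payer
print(authorize(Payment("line_manager", 24_800, "finance_director")))  # True: large payment, dual approval
```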

Dave Bittner: Yeah, that's fascinating. It's an ending that I'm OK with (laughter). 

Joe Carrigan: Right. Yeah, me too. 

Dave Bittner: Yeah. I don't think she's to blame here, and she shouldn't have been authorized to make those payments, certainly not those sorts of amounts. It's one thing to have, you know, access to the petty cash drawer or something, but it's another thing entirely when you're talking about hundreds of thousands of pounds. 

Joe Carrigan: Yeah. 

Dave Bittner: Wow. 

Joe Carrigan: That's right. 

Dave Bittner: All right. Well, interesting story for sure. We'll have a link to that in the show notes. 

Dave Bittner: My story this week comes from SC Magazine, written by Bradley Barth. And the title of the article is "BazarBackdoor Phishing Campaign Eschews Links and Files to Avoid Raising Red Flags." This is interesting. It was based off of some research that the folks over at Cofense published. They're a security company. And, you know, we always talk about how you should never click on the links. 

Joe Carrigan: Right. 

Dave Bittner: And what's happening here - what the folks at Cofense are tracking is that the bad guys know that we are not clicking on the links the way we used to. And so they're taking things to another level. 

Dave Bittner: There are some phishing emails. There's one that they spotted back in February, and it was an order confirmation message from a fake pharmaceutical company. And they're saying that there have been other communications pretending to be from office supply companies, from flower deliveries and even lingerie companies (laughter). So what happens is you get an email that talks about how your order was canceled or something like that. But instead of having a link for you to click on, there's a phone number. 

Joe Carrigan: Really? 

Dave Bittner: Yeah. And it says, please call this number to resolve this issue. And so you call the number, and there's a friendly customer service rep there waiting to answer the phone. 

Joe Carrigan: I see. 

Dave Bittner: And from that point, that's where they get you. That's where they try to convince you to go to a website that is a malicious website. 

Joe Carrigan: I see. 

Dave Bittner: And on that website, that is where the malware will be installed. 

Joe Carrigan: Very clever. 

Dave Bittner: And off you go, right? 

Joe Carrigan: And how do they get the malware installed? Do they use a vulnerability in a browser, or do they actually convince the person, at that point, to download and install something? 

Dave Bittner: It seems like it's a combination of those things... 

Joe Carrigan: OK. 

Dave Bittner: ...Depending on how things go. I think initially they just try to take you to a website that just does everything on its own. But if they can't do that - for example, it talks about how they'll talk you through downloading an Excel file, for example... 

Joe Carrigan: Right. 

Dave Bittner: ...And disabling the protections within Excel itself to keep you from running macros... 

Joe Carrigan: Yup. 

Dave Bittner: ...And those sorts of things and, you know, do it in a very convincing way. This article points out that, as always, you know, never call a number you receive in an email, never follow the email link, right? 

Joe Carrigan: Right. 

Dave Bittner: Go straight to the vendor's website yourself. 

Joe Carrigan: Yeah. This is the problem with email, though, isn't it? Email - and I've said this before. I'll say it again because I like repeating myself. 

Dave Bittner: (Laughter). 

Joe Carrigan: Email is the only service out there on the internet that I can think of, you know, that we all have where anybody can just put something in our inbox. 

Dave Bittner: Yeah. 

Joe Carrigan: Right? We don't know who that person is. We have no clue who the person behind the email is unless we go through the forensic activity of looking at the email headers and seeing what happened, how the email actually got to us. And most people can't go through that and understand what they're looking at, because you really have to have an understanding of the email protocols to get a handle on it. Email is terrible. We need another solution. 
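
For listeners who want to try the header forensics Joe mentions, here is a small Python sketch that walks the Received chain of a raw message to show which servers actually handled it versus what the From line claims. The sample headers below are made up for illustration.

```python
# Walk the Received chain of a raw email (sample message is fabricated).
from email import message_from_string

raw = """Received: from mail.example.net (mail.example.net [203.0.113.7])
 by mx.recipient.example (Postfix) with ESMTP id ABC123
Received: from suspicious-host.example (unknown [198.51.100.9])
 by mail.example.net with SMTP
From: "Your Bank" <support@bank.example>
Subject: Urgent account notice

Body goes here.
"""

msg = message_from_string(raw)
# Each server prepends its own Received header, so the last one listed is closest
# to the origin; reversing the list walks the path from sender toward recipient.
for hop, received in enumerate(reversed(msg.get_all("Received", [])), start=1):
    print(f"hop {hop}: {' '.join(received.split())}")
print("claimed sender:", msg["From"])
```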

Dave Bittner: (Laughter) It is amazing that we haven't come up with one. Because there have been attempts over the years to graft things on to email to make it better... 

Joe Carrigan: Right. 

Dave Bittner: ...And none of them have reached enough universal acceptance to really make them stick. 

Joe Carrigan: Yeah. There are verification techniques you can use to make sure that an email is coming from the server that it claims to be coming from. 

Dave Bittner: Yeah. 

Joe Carrigan: But if the receiving side doesn't enforce those verifications, then anybody can still run a simple SMTP server and claim to be whoever they want. 
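
As a rough sketch of the kind of verification Joe is alluding to, here is an example that looks up a domain's published SPF and DMARC policies, which are just DNS TXT records. It assumes the third-party dnspython package, and real receiving mail servers do far more than this simple lookup.

```python
# Look up a domain's SPF and DMARC records (requires: pip install dnspython).
import dns.resolver

def txt_records(name: str) -> list[str]:
    try:
        return [r.to_text().strip('"') for r in dns.resolver.resolve(name, "TXT")]
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return []

domain = "example.com"  # substitute the domain from the message's From header
spf = [r for r in txt_records(domain) if r.startswith("v=spf1")]
dmarc = [r for r in txt_records(f"_dmarc.{domain}") if r.startswith("v=DMARC1")]
print("SPF:", spf or "none published")
print("DMARC:", dmarc or "none published")
```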

Dave Bittner: Right. What's interesting to me about this is that, you know, these bad guys are investing in a call center (laughter). 

Joe Carrigan: Right. 

Dave Bittner: Right? 

Joe Carrigan: They're investing in infrastructure. 

Dave Bittner: Right. Right. 

Joe Carrigan: We've had stories before about how some of these call centers, like in India, are - actually kind of pull double duty, right? They have their own legitimate business practices... 

Dave Bittner: Right. 

Joe Carrigan: ...But every now and then, they'll take on one of these illegitimate business practices to fill time. They don't really vet their vendors, or they don't really care. They're complicit in it. And they get high commissions on these things... 

Dave Bittner: Yeah. 

Joe Carrigan: ...Because it's pretty much all profit. 

Dave Bittner: Yeah. A call comes in, and they've got a script. All right. Well, we will have a link to that story in the show notes as well. 

Dave Bittner: Joe, it is time to move on to our Catch of the Day. 

(SOUNDBITE OF REELING IN FISHING LINE) 

Joe Carrigan: Dave, our Catch of the Day comes from a listener named Wyatt (ph). And he says, hello, Dave and Joe. "Hacking Humans" is one of my favorite shows. 

Joe Carrigan: Hey, that's great. 

Dave Bittner: Thank you very much. 

Joe Carrigan: It's one of my favorite shows, too. 

Dave Bittner: (Laughter). 

Joe Carrigan: I love all you guys do at the CyberWire. I have two Catches of the Day for you. You can do them both or pick your favorite. 

Joe Carrigan: We're going to actually do both of them. We're going to do one this week and one next week. 

Joe Carrigan: So the first one is a phishing email he received, and he couldn't resist playing with whoever sent it. He writes, since they verified their story by linking to a CNN article, I decided to Google random Wyatts who had won lotteries so that they could verify my story. I even went so far as to temporarily change the name of my Google account to match this Wyatt who had won a lottery. 

Joe Carrigan: So Wyatt here is actually, you know, getting into his own social engineering practice here. 

Dave Bittner: (Laughter). 

Joe Carrigan: But these folks - or these scammers sent him an email claiming to come from Jason and Barbara Wood (ph), who have a Japanese email address. It ends in .jp, which is the top-level domain for Japan. 

Dave Bittner: Interesting. 

Joe Carrigan: So, Dave, why don't you read the email from the scammers? 

Dave Bittner: All right. It goes like this. 

Dave Bittner: (Reading) My wife and I won the Mega Millions jackpot for $410 million U.S. on June 9, 2020, and we voluntarily decided to donate the amount of $10 million U.S. for charity. We're trying to reach random people from various sources and fashions to touch life from various angles. Therefore, you get the message here. You've been registered as one of the lucky recipients to receive $2 million U.S. This donation is given to you, allowing you to take care of your personal problems and in large part to generously helping us reach out to give to the less fortunate, orphans and charitable organizations in your neighborhood locality. To verify, here's a link to a CNN story. Get back to me on how to receive the donation. Thank you, Jason and Barbara Wood. 

Joe Carrigan: (Laughter) Wyatt responded to this email. He says, (reading) this is a very kind gesture, and I'm so happy that you're trying to give some of your money to charity. To be honest, however, I don't really need the money. In 2007, I, too, won the lottery. Yup, I won the Louisiana Lottery for $1 million and received $469,000 after taxes. I decided to use that money to quit my job at the phone company and start my own bulldozer rental company. 

Joe Carrigan: (Reading) On November 8, 2010, I decided to take $1,000 out of my savings and buy exactly 4,000 bitcoin. It was really a freak accident, but when I saw that these bitcoins hit 25 cents a coin, I decided it would be perfect time to buy an even amount of them. I'm very OCD. Since then, I have bought and sold various cryptocurrencies, and I am now at the point where I am even richer than you are. The $2 million doesn't interest me. I think it's wonderful that you are looking to spend money on charity. And when you find a good one, let me know so that I can match your donation. Thank you very much for what you are doing. 

Dave Bittner: (Laughter). 

Joe Carrigan: So that's a good response, Wyatt. 

Dave Bittner: Yeah. 

Joe Carrigan: I'd be interested to hear if they ever wrote back to him. 

Dave Bittner: Well, chewing up some of their time... 

Joe Carrigan: Yup. 

Dave Bittner: ...Which is good. But also, I mean, you know, it's good - it's good-natured. 

Joe Carrigan: Yes. 

Dave Bittner: And I like it. It's in the spirit of fun. 

Joe Carrigan: We'll have another one from Wyatt next week as well. 

Dave Bittner: (Laughter) All right. Terrific. Well, thank you, Wyatt, for sending that in. Again, if you have a Catch of the Day for us, you can send it to us at hackinghumans@thecyberwire.com 

Dave Bittner: Joe, I recently had the pleasure of speaking with Helen Lee Bouygues. She's from an organization called the Reboot Foundation, and our conversation centered on social media's effect within the misinformation ecosystem and how folks can best fight fake news. Here's my conversation with Helen Lee Bouygues. 

Helen Lee Bouygues: Social media platforms make money when eyeballs stay on their platforms longer. So their algorithms and the way that they function visually are configured to prey on people's emotions so that people stay online longer. And that is fundamentally one of the problems linked to why we have so much poor news judgment, because obviously, when people are being emotional, it's hard for them to think rationally. 

Dave Bittner: But when you say poor news judgment, what do you mean by that? 

Helen Lee Bouygues: Based on the Reboot Foundation studies, we have done research that shows that when people stay on social media longer, their ability to identify fake news or misinformation gets measurably worse. So that's what I mean by poor news judgment. 

Dave Bittner: We sort of half-joke about how a lot of us throughout the pandemic are doing doomscrolling - you know, waking up in the middle of the night and finding ourselves - you know, you blink your eyes, and you spend an hour going through Twitter or TikTok or any of the other platforms. And they really do suck you in. 

Helen Lee Bouygues: And that's exactly the point about how they prey on people's emotions to suck us in and actually keep us online. In addition to the visual features that attract people to stay online, again, when we talk about the configuration of the algorithms, these social media platforms will deliberately look at what you've clicked on, what you've liked and what you've scrolled through previously and try to show you more and more articles or opinions that are in line with what you previously looked at. So in other words, these platforms are deliberately helping you do selective thinking and hence creating, truly, an echo chamber. 

Dave Bittner: Is it even to the point where, for example, if I'm scrolling through something in, like, Twitter - you know, I'm scrolling through a Twitter feed and I pause on something to read it, is the fact that I've stopped scrolling, that I've paused there - is that registered as me showing interest? 

Helen Lee Bouygues: Perhaps not the time period, but more the click-throughs on the types of articles. So obviously, if you have subscribed to and are clicking through more pro-Trump articles or tweets, then that's what's going to naturally show up on your Twitter feed. Even if you also subscribe to pro-Biden sites, you're more likely to be shown more pro-Trump types of articles and, eventually, clickbait. 

Dave Bittner: Now, I suppose one of the other major issues here is that these algorithms are opaque. I mean, we really don't have any real view inside what they're doing. 

Helen Lee Bouygues: You're absolutely right. And today, there is a lot of discussion around Section 230, which is really the law that protects these social media platforms from potential litigation liability for disseminating misinformation. 

Helen Lee Bouygues: However, in reality, the real challenge that we have is not just the fact that they might be spreading misinformation by the information they show. The real challenge is the fact that there is very little to no transparency as to why we're seeing what we're seeing on social media. 

Helen Lee Bouygues: And even though most people actually believe that they're quite good at identifying fake news, based on one of the surveys that the Reboot Foundation did, less than 1% of the population actually applies true fact-checking techniques when they're reviewing articles. 

Dave Bittner: Can you give us a little bit of the background on the Reboot Foundation itself and the mission of the organization? 

Helen Lee Bouygues: So the Reboot Foundation - we started it in 2018. And the original goal was really to try to elevate critical thinking in schools and in families, as well as in the workplace. And the origin of this very narrow focus on critical thinking really came from a personal story, where I noticed that my daughter, who's now 10 - so back then, she was 7 years old - predominantly gathered information online and not offline. 

Helen Lee Bouygues: In other words, we just talked about social media, but it's not just social media. Even information gathering via Google or other search engines is subject to algorithms that, again, take into account what sites you've been to before. And so there's a bit of selective thinking going on even in terms of our information-gathering methods. And with that, that was really the genesis of the Reboot Foundation. 

Dave Bittner: And how do you go about achieving your goals? What sort of resources are you making available? 

Helen Lee Bouygues: So at the Reboot Foundation, we publish different articles and studies, and we fund different researchers that study both ways of improving critical thinking, but also more recently around fake news and the dissemination of misinformation. 

Helen Lee Bouygues: Now, you may say, David, what's the link between critical thinking and fake news and misinformation? Well, our susceptibility to fake news is in part a consequence of the fact that we're not doing as much critical thinking. So there is a direct link. 

Helen Lee Bouygues: But in addition to the research and studies that we publish, we've actually created two different types of guides as well. One is a guide for parents on how to engage in critical thinking education with your children. And we've divided that up into different age groups because, obviously, a parent might not feel as comfortable addressing critical thinking subjects with a 16-year-old as one would with a 6-year-old. And we've more recently also published a teachers' guide, working with teachers across the country in different school subjects, on how to better integrate critical thinking into their curricula. 

Dave Bittner: You know, I remember as a middle schooler having a really inspiring science teacher who was really deliberate about instilling us with a sense of the importance of critical thinking skills. And it's something we spent time on. And then, you know, as I got older - I remember Carl Sagan's, you know, "Baloney Detection Kit" (laughter) and... 

Helen Lee Bouygues: (Laughter). 

Dave Bittner: ...And reading "The Demon-Haunted World," which was another - you know, a book that really focuses on a lot of these skills. And I guess where I'm going with this is that as a teenager, you know, growing up in the '80s, I would have hoped we would've been farther along than we are. It's frustrating to me that, in some ways, it feels as though - I suppose thanks to social media and some of the other things like fake news, it feels as though perhaps we're slipping when it comes to these things. 

Helen Lee Bouygues: There's two things. One is you were lucky enough that you had this science teacher when we were - when you were in middle school. Unfortunately, one of our research reports demonstrates that, basically, over a third of middle school students in the U.S. say that they've rarely or never learned how to judge the reliability of sources. And over 50% of teachers that we've interviewed say that they don't actually methodically, systematically teach critical thinking in their curriculum. 

Helen Lee Bouygues: Now, I'm not saying that's, you know, necessarily the fault of teachers because, obviously, our education system today is so bombarded with test-based teaching. And so, you know, there's a lot of material that they have to get through. But the fact of the matter is our schools today are not doing enough to teach critical thinking skills, let alone media literacy skills, both of which are critical to really arm our children when they are gathering information, especially via social media - just as adults today are gathering over 90% of their news via social media platforms. 

Helen Lee Bouygues: And you can imagine, David, if these children are not developing these skills in school and they're subject to TikTok, which in the beginning might just look like a small gimmick - you know, learning a few dance moves or some cutesy videos - but in reality, our children are actually gathering information from sources like TikTok, where they don't necessarily have the tools to be able to challenge when some misinformation might be distributed to them. 

Dave Bittner: So for those of us who are trying to fight the good fight here and want to help our friends, our family, our loved ones and better equip them with these sorts of skills, what do you recommend? What are some of the things we can be doing? 

Helen Lee Bouygues: I think one thing that we all should do is take some time for a digital detox. You know, something that you mentioned earlier, David, about automatically scrolling, especially during this COVID period, hours and hours on Twitter and different forums - it's good to just take a breather. It can be a complete digital detox, or it can be consciously saying, I'm only going to gather information by deliberately going to the websites of news sources rather than clicking through via social media. So that's one dimension. 

Helen Lee Bouygues: The second is pure awareness - reminding ourselves, just as we need to do when we educate children in media literacy, to take a pause and see who the author is. What are the sources of what we're reading? 

Helen Lee Bouygues: And thirdly, one of the critical things I think we need to do to pull ourselves away from these echo chambers is to deliberately, conscientiously try to review, understand and reach out to those with an opposing view. Part of the challenge that we have with social media is not just misinformation or disinformation; it's the fact that we become so tunnel-visioned, because we're only seeing information from those who think like us, whether it's factual or not. 

Dave Bittner: I think it's a fascinating idea. I mean, you know, people get so dug in these days, especially with the degree of polarization that we have. I wonder if exposing people to critical thinking skills can kind of be a way to soften those defenses, to - 'cause you're not coming directly at their beliefs. But you're just saying, hey, look; here's a set of tools for evaluating the things in your life, and hopefully that'll help them get to better ways of establishing what's so and what's not. 

Helen Lee Bouygues: I'm a fundamental believer that people want to be good consumers of information. And again, we're not aided - right? - because everything is about instant gratification. What we're being exposed to is basically similar thinking. It really requires a special effort to deliberately reach out and seek opposing views. 

Helen Lee Bouygues: One of the surveys that the Reboot Foundation did was a question around how often people actually seek out opposing views. Well, the irony was 1 in 4 people will deliberately avoid people who have opposing views. Now, this is before COVID, so this is when people could actually communicate. But, you know - so we just need to be conscientious of that because I think a lot of it has to do with awareness. 

Helen Lee Bouygues: And I completely agree with you, David. If we deliberately make that effort to try to consider opposing views, not only do you actually strengthen your own argumentation - you could even reinforce your own convictions by reviewing opposing views - but hopefully we can also somewhat limit some of the polarization that we're seeing in today's society. 

Dave Bittner: Joe, what do you think? 

Joe Carrigan: Bam. Helen has hit this - hit one out of the park as far as I'm concerned. 

Dave Bittner: OK. 

Joe Carrigan: Social media makes its money by keeping your eyes on the page. And these apps are designed to keep you engaged, and they use your emotions to do it. There are even some people who've said that Facebook and Twitter are designed to be addictive. These are people who have left those companies and said, I can't do this anymore. You know, you can think of Facebook and Twitter as the modern-day tobacco companies almost. 

Joe Carrigan: I took a look at the research Helen is referencing here at the start of the interview. There are a couple of interesting things it says. No. 1, the more people use social media, the worse their news judgment. And this relationship held true even after excluding power users - people who spend more than 10 hours a week on social media. Even among the people who spend less than that, the more time you spend, the worse your discernment is. There's a lot of research on the organization's website. You should check it out. 

Joe Carrigan: I would like to point out something about the transparency problem with these algorithms. You spent a good bit of time talking about the algorithms and how we're presented with these things. These are machine-learning algorithms. One of the big problems of machine learning is explainability. In other words, if you put a gun to the head of these social media companies and said, explain why this story shows up on my page, they would not be able to answer you. They don't know how. And that's kind of the big secret. The secret is not what the algorithm is, right? The algorithm is probably some kind of neural network or some other well-known machine-learning algorithm. 

Dave Bittner: Yeah. 

Joe Carrigan: The secret is they don't know why you're seeing these things. What's happening is they've given the algorithm the goal of keeping you engaged, keeping you on the site. And the algorithm, with cold, methodical precision and complexity beyond human understanding, is doing its job. That's the problem - or one of the problems here. There's lots of problems. 

Dave Bittner: (Laughter). 

Joe Carrigan: But I think that's a key takeaway that everybody should have. 
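
As a toy illustration of the engagement feedback loop Joe describes - this is not any platform's actual system - here is a sketch where items are ranked by a learned per-topic interest score that grows with dwell time, so the feed drifts toward whatever already held the user's attention.

```python
# Toy engagement-maximizing ranker (hypothetical topics and weights).
from collections import defaultdict

interest = defaultdict(lambda: 0.1)    # learned per-topic engagement score

def rank(feed: list[tuple[str, str]]) -> list[tuple[str, str]]:
    # Show the topics the user has engaged with most, first.
    return sorted(feed, key=lambda item: interest[item[0]], reverse=True)

def record_engagement(topic: str, seconds_viewed: float) -> None:
    interest[topic] += 0.01 * seconds_viewed   # more dwell time, more of the same

feed = [("politics_a", "story 1"), ("politics_b", "story 2"), ("sports", "story 3")]
record_engagement("politics_a", 45)            # user lingers on one kind of story
print(rank(feed))                              # that kind now floats to the top
```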

Joe Carrigan: When she starts talking about education, it seems like a lot of modern education is based on regurgitation. We spend a lot of time teaching to tests, right? That's not good for anybody. It doesn't avail the student of critical thinking skills. That is almost something that has to be taught at home nowadays. And her website has a list of tools on it so you can help talk to your kids about critical thinking. 

Joe Carrigan: The lack of opposing viewpoints is a huge problem in social media. It... 

Dave Bittner: Right, the bubbles. 

Joe Carrigan: The bubbles. These algorithms are designed to keep you engaged, like I was just talking about, and that means that you don't get the opposing viewpoints because that tends to make you go, I don't want to hear about this, and walk away. 

Dave Bittner: (Laughter) Right. Let me walk away and think about that for a minute. 

Joe Carrigan: Right, exactly. No, no, don't walk away. Keep listening to the same thing over and over again. 

Dave Bittner: (Laughter) Right, right, right. 

Joe Carrigan: Digital detox - I talk about this. I have uninstalled all of the social media apps from my phone, with the exception of Facebook Messenger, which I use to communicate with family members. I cannot sit on my phone anymore and scroll through Twitter, as you put it, doomscrolling, you know? 

Dave Bittner: Yeah, right. 

Joe Carrigan: I just can't do that. 

Dave Bittner: Yeah. 

Joe Carrigan: And, you know, every now and then, I go, I would like to be on Twitter right now, but I'm like, no, I'm not putting it back on my phone. I'm just not doing it - or Facebook or anything. 

Dave Bittner: Yeah. 

Joe Carrigan: It's just not there. 

Joe Carrigan: And finally, I have to say this again. I've been saying this for years now. One, do not get your news from social media. Nobody should get their news from social media. If you see a news article on social media, just ignore it. 

Dave Bittner: (Laughter). 

Joe Carrigan: Don't even pay attention to it. 

Dave Bittner: Right, right. 

Joe Carrigan: Go to a trusted news source. Get your news from a trusted news source. Make sure that you have a news source that sometimes has an opinion page that's different from your opinion - right? - and read their articles. Look at AllSides. They do a pretty good job of showing you how different news organizations are reporting a single story. 

Dave Bittner: Yeah. 

Joe Carrigan: That's a great website. And I'm going to say this again. Social media is not a valid platform for political discussion. It is a valid platform for division and for insults. 

Dave Bittner: (Laughter). 

Joe Carrigan: It is not conducive to political discussion. You cannot have a constructive political discussion on social media. It cannot be done. 

Dave Bittner: (Laughter) OK. You know what - you know, Joe, what happens when you deal in absolutes. 

Joe Carrigan: Right, yeah. 

Dave Bittner: You become a Sith Lord. So... 

(LAUGHTER) 

Joe Carrigan: Angry old man yells in the microphone. 

(LAUGHTER) 

Dave Bittner: You know, just this past week, I was scrolling through Twitter. And it struck me - a story came by, and it was a horrible, heartbreaking story about something terrible that had happened. 

Joe Carrigan: Right. 

Dave Bittner: But it was in Brazil. And it was a small-town story about a tragedy that took place. And it - I thought to myself, I don't want to see this. 

Joe Carrigan: Right. 

Dave Bittner: Like, OK, I - you know, this is a terrible story. It makes me sad. But in no way, shape or form does this affect my life. I can't do anything about it, right? It's far away. It involves no one I know, no one I will ever meet. It's half a world away from me. All seeing this story does is put me in a bad mood. 

Joe Carrigan: Yeah. 

Dave Bittner: I wish I could tell Twitter, knock it off. 

Joe Carrigan: Right. 

Dave Bittner: (Laughter) You know, I don't want to see these things. And I suppose it's a slippery slope or a tricky thing because that's how you build your bubble, right? 

Joe Carrigan: Right. Yeah, yeah. 

Dave Bittner: Yeah, so... 

Joe Carrigan: Although if you don't want to see those stories anymore, when you see that story come up, close Twitter. 

Dave Bittner: Now, let's not get carried away, Joe. 

(LAUGHTER) 

Joe Carrigan: The algorithm will go, oh, I showed that story to Dave, and he didn't like that. 

Dave Bittner: Yeah, yeah, yeah. All right, all right (laughter). 

Dave Bittner: Well, again, our thanks to Helen Lee Bouygues from the Reboot Foundation for joining us. We do appreciate her taking the time. 

Dave Bittner: That is our show. We'd like to thank all of you for listening. And, of course, we want to thank the Johns Hopkins University Information Security Institute for their participation. You can learn more at isi.jhu.edu. 

Dave Bittner: The "Hacking Humans" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner. 

Joe Carrigan: And I'm Joe Carrigan. 

Dave Bittner: Thanks for listening.