Let's talk about Facebook's research.
Steven Levy: You would think that all steps, any step possible, would be taken to change that situation. But those steps weren't taken. They were saying, well, you know, gee, if we change that, people will use Instagram less.
Dave Bittner: Hello, everyone, and welcome to "Caveat," the CyberWire's privacy, surveillance, law and policy podcast. I'm Dave Bittner, and joining me is my co-host, Ben Yelin, from the University of Maryland Center for Health and Homeland Security. Hello, Ben.
Ben Yelin: Hello, Dave.
Dave Bittner: On this week's show, Ben shares a newly decided court case on whether Big Tech companies can be sued under the anti-terrorism statute. I've got the story of some warrantless surveillance being declared unconstitutional in Colorado. And later in the show, my conversation with author and journalist Steven Levy. He's editor-at-large at WIRED, and his most recent book is "Facebook: The Inside Story." He joins us with insights on Facebook's internal research teams.
Dave Bittner: While this show covers legal topics and Ben is a lawyer, the views expressed do not constitute legal advice. For official legal advice on any of the topics we cover, please contact your attorney.
Dave Bittner: All right, Ben, let's jump right into our stories here. Why don't you kick things off for us?
Ben Yelin: So I have a really interesting case that was decided in the 11th Circuit Court of Appeals. Once again, I saw one of my favorite Twitter followers, Gabriel Malor, who always alerts me to interesting appeals court cases and flagged this one the other day.
Ben Yelin: So this case concerns the Pulse nightclub shooting in 2016. Omar Mateen, who was not an official affiliated member of ISIS, but was inspired by ISIS, committed really an atrocity at the Pulse nightclub in Orlando...
Dave Bittner: Right.
Ben Yelin: ...An LGBTQ nightclub, I think killed upwards of 50 people. He was later shot by law enforcement. The question in this case is whether Big Tech companies - and the three named here are Google, Facebook and Twitter - can be sued as aiding and abetting this level of terrorism.
Ben Yelin: So the allegation is that Mr. Mateen became radicalized in his views via social media. He was reading propaganda tweets, Facebook posts. He was able to communicate with, you know, known ISIS leaders via these platforms. He was watching YouTube videos, which is part of the Google family. So the question here is, do the survivors of the attack and the relatives of those who perished have a cause of action?
Ben Yelin: There's something called the Anti-Terrorism Act - a 1990s-era piece of legislation, a federal law - that says people have a cause of action against anybody who aids and abets acts of international terrorism. So these plaintiffs filed suit against these three companies, alleging a violation of that act. And they're looking for monetary damages.
Dave Bittner: Interesting.
Ben Yelin: The complicating issue is that the ATA - the Anti-Terrorism Act - only applies to international terrorism.
Dave Bittner: Right.
Ben Yelin: So the question here is, is this international terrorism? And that's where things get interesting.
Dave Bittner: But he's not an - I mean, he's domestic, right?
Ben Yelin: Yes.
Dave Bittner: He's a citizen of the United States.
Ben Yelin: He is.
Dave Bittner: The crime was committed in the United States.
Ben Yelin: He lived in Florida.
Dave Bittner: Yep.
Ben Yelin: Florida, last I checked, is still part of the United States.
Dave Bittner: (Laughter) Bugs Bunny has not yet succeeded in sawing it off.
Ben Yelin: In sawing it off, yeah...
Dave Bittner: (Laughter).
Ben Yelin: ...Despite our wildest dreams. Just kidding. We love you, Florida.
Dave Bittner: We love your theme parks.
Ben Yelin: Yeah. So...
Dave Bittner: And your sunny beaches (laughter).
Ben Yelin: So yes, this was a domestic terrorist attack. It took place domestically. It was a U.S. person who committed the attack. But he was radicalized by ISIS, which is an overseas group and is recognized as a terrorist organization by our State Department.
Ben Yelin: Unfortunately for the plaintiffs here, the test for international terrorism includes the following. It has to be - it has to involve a violent act or an act dangerous to human life. You can check that one off the list. It has to appear to be intended to intimidate or coerce civilian population. I think you can make an argument that it's doing that here.
Dave Bittner: Right.
Ben Yelin: The third part of the test is it either has to occur primarily outside the territorial jurisdiction of the United States or, quote, "transcend national boundaries in terms of the means by which they are accomplished." So either it has to take place abroad, or it's an attack that takes place here that's supposed to have a - you know, broader implications as part of some sort of global jihad.
Dave Bittner: OK.
Ben Yelin: What the court here is saying is the plaintiffs did not meet that third part of the test. While, you know, this is obviously a completely tragic scenario, he isn't an affiliated member of an international terrorist organization. This wasn't part of a larger terrorist plot that threatens the broader national security of the United States. Just because he's claiming association with a known terrorist group doesn't mean this meets that definition of occurring either outside our jurisdiction or something that transcends national boundaries.
Dave Bittner: So because he was being inspired by international terrorists but was not being directed by them, right?
Ben Yelin: Exactly. Now here's where it gets complicated. So there have been other cases - I think we've probably - I think we've talked about a couple of them...
Dave Bittner: Yep.
Ben Yelin: ...Along these lines, where somebody is radicalized online. In those cases, the terrorist group in question didn't claim any association with the terrorist in question.
Dave Bittner: OK.
Ben Yelin: That's not the case here. ISIS claimed responsibility for the attack. So Mateen - Omar Mateen - said a few days before the attack in one of the online forums, you know, look for a terrorist attack in the next few days in the United States. And ISIS, in the day or so after the attack, said, we claim responsibility for it. And you know, we were at war with ISIS at the time. It was at the height of their strength in Iraq, especially, I think, in 2016.
Ben Yelin: So you know, I'm not sure if this necessarily meets that definition of transcending national boundaries. But the result of the case here is that these tech companies, despite facilitating conversations between this terrorist and terrorist organizations, are not going to be held liable. And I'm not sure that people are going to be pleased with the result here.
Dave Bittner: What about Section 230 here? I mean, doesn't that in itself get them off the hook?
Ben Yelin: Well, so there's an interesting interplay between Section 230 and the Anti-Terrorism Act. I don't think we've seen this fully tested in court. So yes, you can't be held liable for things that are posted on a platform. But we haven't gotten to the merits of a case on how that interplay is with the Anti-Terrorism Act, where the specific action is for aiding and abetting members of an international terrorist organization. So it's possible that's not something that's covered under Section 230. But we're not - we didn't get to that part of the case because it was thrown out on this jurisdictional issue.
Dave Bittner: I see. Is that the end of it? Could it go - can it go on? Or are we done with this?
Ben Yelin: We aren't done with this case.
Dave Bittner: OK.
Ben Yelin: I mean, the plaintiffs could appeal and potentially try and, you know, get cert with the United States Supreme Court. I don't necessarily see that happening. I haven't seen a sufficient split among circuits on this issue that would justify the Supreme Court stepping in. And you know, so I do think this is the end of the requests for relief from these plaintiffs, who have obviously suffered this enormous tragedy.
Dave Bittner: Right.
Ben Yelin: There is an obvious step that can be taken here. Our Congress critters...
Dave Bittner: (Laughter).
Ben Yelin: ...Friends in Washington could step in and change the anti-terrorism statute. And they could say it doesn't necessarily have to have this transcend national boundaries element to it in order to hold the tech companies potentially liable. There seems to be a lot of hemming and hawing about these tech companies. They're not exactly popular in D.C. among members of either political party. You'd think this is a perfect opportunity to stick it to them...
Dave Bittner: Right.
Ben Yelin: ...Change the statutes so they could be held liable, even if this isn't, you know, al-Qaida or ISIS itself committing this act of terrorism.
Dave Bittner: Interesting. Yeah. I mean, I guess it's a shot across the bow for a lot of the Big Tech companies. Part of me wonders, are we in tricky territory here when someone claims inspiration? You know, but as you said, you know, ISIS claimed the event, whether or not, you know, after the fact (laughter)...
Ben Yelin: Right.
Dave Bittner: ...Right? - if they said, oh, hey, look; that happened, that's good for us. Yes, we'll take credit for that.
Ben Yelin: That was us. Yeah.
Dave Bittner: Right.
Ben Yelin: I mean, that's what is always sort of confusing with these cases. We see this a lot, where of course ISIS is going to want to take credit for attacks that happen in the United States because it makes them look more powerful.
Dave Bittner: Right.
Ben Yelin: Any person can be affected by propaganda online. Most people who take in that propaganda don't commit acts of - you know, of this magnitude, killing...
Dave Bittner: Yeah.
Ben Yelin: ...49 people at a nightclub.
Dave Bittner: Yeah.
Ben Yelin: But yeah, I mean, at the very least, you could say he has buy-in from an international terrorist organization. I mean, I think where the court's coming from, and I certainly understand this argument, is it's not like he was at an ISIS training camp for 10 years, and this was...
Dave Bittner: Right. Right.
Ben Yelin: ...Part of a broader plot against the United States. He wasn't working with Bin Laden or Khalid Sheikh Mohammed, you know, or any of the major players to plan, like, a 9/11-style attack.
Dave Bittner: Yeah.
Ben Yelin: He was a guy who was radicalized, you know, in his little micro way, was going to attack the United States by doing what he could, which is taking advantage of his access to firearms and killing lots of people in a club. But it wasn't part of a broader terrorist transnational plot.
Dave Bittner: Yeah.
Ben Yelin: You know, it's up to Congress whether the anti-terrorism statutes should apply in those circumstances. I just think this is an opportunity that's ripe for them to expand the application of the Anti-Terrorism Act.
Dave Bittner: Interesting. Interesting. All right. Wow. Yeah, that's - there's a lot of wheels spinning on that one, isn't there?
Ben Yelin: It's a fascinating case. And you know, I think we're going to see more cases of this. It takes so long to make it through the court system. I mean, if you'll recall, that attack was now over five years ago. So you know, we may not get a satisfying answer on this question for a while, if we ever get one.
Dave Bittner: Yeah. All right. Well, my story this week comes via the EFF - the Electronic Frontier Foundation. They posted this on their own website, written by Jennifer Lynch. And they're following up on some news coming out of Colorado, where the Colorado Supreme Court has ruled that three months of warrantless video surveillance violates the Constitution. And this centers around a case called The People v. Tafoya.
Dave Bittner: And basically what happened here - this is sort of follow-up on something we've covered here before. Police had received a tip about some drug activity, and they put a camera on a utility pole across from a gentleman, Rafael Tafoya - across from his home. And they were able to see his front yard, his driveway and his backyard. Now, his home had a 6-foot-high privacy fence. So people walking by the house couldn't look in his yard...
Ben Yelin: Right.
Dave Bittner: ...Couldn't look in his house, couldn't - you know, so he was clear. His intentions were...
Ben Yelin: Unless an NBA center came by. Then maybe they could, you know.
Dave Bittner: (Laughter) Right. Right. Exactly. Right. His - but his intentions were clear...
Ben Yelin: Yes.
Dave Bittner: ...Right? - to have a certain degree of privacy in his home. But having this camera up on a utility pole meant that police could see in, and they had remote control over this camera. They could pan, tilt and zoom, and they could store their footage indefinitely.
Dave Bittner: So Tafoya was arrested for drug trafficking. And at trial, his counsel moved to suppress all the evidence resulting from warrantless surveillance, saying that it violated the Fourth Amendment. The trial court denied the motion, and he was convicted on drug trafficking. The Court of Appeals reversed. They agreed with Tafoya that the surveillance was unconstitutional.
Dave Bittner: So just recently, the Colorado Supreme Court upheld the Court of Appeals' opinion. And they found that continuous long-term video surveillance violated his reasonable expectation of privacy. And there's a quote from the court. They said, "Put simply, the duration, continuity and nature of surveillance matter when considering all the facts and circumstances in a particular case." The court held that 24/7 surveillance for more than three months represented a level of intrusiveness that a reasonable person would not have anticipated.
Dave Bittner: So OK. That's interesting. But I think additionally what's interesting is that now we have some courts from around the country weighing in on this. This article points out that Massachusetts - their Supreme Court has agreed with this ruling. They've had a similar ruling of their own. But some other courts have ruled the opposite. The Seventh Circuit held that a pole camera was fine for 18 months, that it didn't violate the Fourth Amendment. The First Circuit overturned a district court's decision that eight months of pole camera surveillance violated the Fourth Amendment.
Dave Bittner: So I'm interested in your take on this, Ben. Now that we're seeing disagreements from the circuits, where do we go next with this? Does this go to the Supreme Court?
Ben Yelin: It very well might. And frankly, I think it's the Supreme Court's responsibility to resolve this issue. And I will explain why.
Dave Bittner: OK.
Ben Yelin: So in order for there to be a search under the Fourth Amendment, a person has to display a subjective expectation of privacy. And that expectation has to be one that society is willing to recognize as reasonable. In this case - I think one of the determining factors in this case is the fact that Tafoya exhibits - strongly exhibited that subjective expectation of privacy by building a 6-foot fence...
Dave Bittner: Right.
Ben Yelin: ...To protect his private property. The implication of that is perhaps if he didn't build a 6-foot fence, if he didn't display that subjective expectation of privacy, maybe this would not have been a Fourth Amendment search, and thus would not have been an unreasonable search and seizure.
Ben Yelin: So the Electronic Frontier Foundation is concerned that this case was more decided on that particular fact that he had set up this 6-foot fence. And I think they want a broader opinion by the Supreme Court or perhaps other federal courts to have some sort of uniform rule as it relates to utility pole cameras - if not utility pole cameras themselves, this type of invasive, long-term surveillance.
Ben Yelin: You read what I think is the money quote from this case, which is, "the duration, continuity and nature of surveillance matter when considering all the facts and circumstances in a particular case." The problem is that in the Supreme Court cases that are relevant on this issue - U.S. v. Jones and Carpenter v. United States - those terms - duration, continuity and nature of surveillance - aren't specifically defined. We know bits and pieces of what counts as a search in terms of duration and what doesn't.
Ben Yelin: So in Carpenter, it was seven days of historical cell site location information, for example. But they didn't come down with a hard-and-fast rule on duration, saying seven days - not OK, eight days - OK. It was the same way in the United States v. Jones. So we don't exactly know where the dividing line is, which means that state courts, as in this case, and other federal courts have to take the general guidance from Jones and Carpenter and try to apply it to the specific circumstances.
Ben Yelin: I think it was properly applied here. I mean, three months of monitoring somebody's house certainly seems like a long duration. It's pretty invasive surveillance when you can zoom in, zoom out, pan and store the video indefinitely.
Dave Bittner: Yeah.
Ben Yelin: I certainly think all of those qualify. But we've had, as you say, courts across the country disagree on this issue. And I think they're disagreeing because there isn't proper, specific guidance from the Supreme Court properly delineating what counts as too long for the purposes of a Fourth Amendment search, what counts as too intrusive. I don't know if we're ever going to get a satisfying answer to that question.
Dave Bittner: Yeah. That was my next question - is would the Supreme Court be interested in making that a bright line?
Ben Yelin: I mean, there's nothing in the Constitution that would allow them to draw a line at any particular duration. But they do do that stuff all the time in all other types of cases.
Dave Bittner: OK.
Ben Yelin: I mean, there's nothing in the Constitution about abortion and/or pregnancy. But the entire Roe v. Wade decision was divided by trimesters of pregnancy. So if they want to make things up in terms of a relevant duration, they've certainly done that in the past in a variety of circumstances. It's not the best way to do it.
Dave Bittner: Right.
Ben Yelin: You know, the potential solution is looking at something besides duration, something that's a little bit more tangible, in order to come up with a dividing line for what counts as a search and what doesn't. And I think the invasiveness of the technology itself might end up being that determining factor. So the Supreme Court might have to declare that utility pole surveillance is the type of technology that, you know, prior to its existence would have required vast law enforcement resources.
Dave Bittner: Right.
Ben Yelin: You would've had to have a dude climbing a tree.
Dave Bittner: (Laughter) Right. Boy, that lineman has been working on that electrical connection there for a long time now (laughter).
Ben Yelin: Very long time. Yeah. He must have significant upper body strength...
Dave Bittner: Right (laughter).
Ben Yelin: ...To be up there. So that's what would have been required in the past.
Dave Bittner: Yeah.
Ben Yelin: And in order to maintain that equilibrium, it might be wise for the Supreme Court to jump in and say, all right, utility poles, because of the invasiveness of that technology, because you can keep them up there - it's very little effort. I presume it's not much of a cost to law enforcement to keep one camera on a utility pole...
Dave Bittner: Right.
Ben Yelin: ...That that's going to require a warrant. I think this case, because we have a split among state courts and federal courts, is very ripe for Supreme Court consideration. I look forward to covering it on the show when it finally gets there.
Dave Bittner: (Laughter).
Ben Yelin: I might have more gray hair than I do now...
Dave Bittner: Yeah.
Ben Yelin: ...At that point. But I do anticipate that that's a decision they're going to have to make.
Dave Bittner: All right. Yeah. Interesting, for sure. All right. Well, we'll have a link to that. Again, this is coverage from the EFF. So obviously, they're coming at this from their own point of view (laughter).
Ben Yelin: Sure.
Dave Bittner: But I think, overall, their coverage of it is pretty fair and, you know, covers the facts. So I think it's worth sharing. We'll have a link to that in the show notes.
Dave Bittner: We would love to hear from you. If you have a story that you would like us to cover or a question for me or for Ben, you can write us. It is caveat@thecyberwire.com.
Dave Bittner: All right. Ben, we are in for a treat this week. I recently had the pleasure of speaking with author and journalist Steven Levy. I will admit to being a bit of a fanboy when it comes to Mr. Levy.
Ben Yelin: We're all fanboys of something, Dave. No shame.
Dave Bittner: (Laughter) No. I've just - I have enjoyed his writing for a long time. In particular, his book "Hackers: Heroes of the Computer Revolution" was just a book I absolutely devoured when it came out. And if you're interested in the history of hackers and computers and the folks who were part of that first wave of folks using computers and figuring out how to exploit them, it's a must read.
Dave Bittner: His most recent book is titled "Facebook: The Inside Story." And we reached out to him in response to this series of articles that The Wall Street Journal put out recently, digging into some things going on at Facebook. And my conversation with Steven Levy specifically centers around the research teams. Here's my conversation with Steven Levy.
Steven Levy: Well, research started pretty early in Facebook's history. They were watching what people did almost from the get-go. But in 2006, they hired a really bright person named Jeff Hammerbacher to make all the data very, you know, easy to search. And, you know, he created this infrastructure that allowed them to take the data and do all kinds of research.
Steven Levy: And they began to hire social scientists and statisticians to conduct the research in a more organized fashion. Interestingly, research was part of the organization of Facebook that was devoted to growth.
Steven Levy: So a lot of the research was devoted to ways that people would stay on Facebook longer, and to helping them discover not only ways that people might use Facebook better - a lot of companies in Silicon Valley use researchers to test how well you use the product, what you might want to do in a product that you can't do, what you have difficulty doing. But in Facebook's case, they also figured out how the algorithm would work to keep you using it more.
Steven Levy: And one of the big breakthroughs that happened in research was when they discovered how things can go viral in the system. And they published a paper on it called "Gesundheit" because like a sneeze, certain things can go viral. And they thought that was the greatest thing ever. And they never realized, the researchers who published that, that really is the key not only to fun things going viral, but, as it turns out, some things that create anger or divisiveness or just misinformation that's harmful. So the research is sort of a mixed bag there.
Dave Bittner: As this sort of information has come in to Facebook, as they've realized that, you know, not everything they do in their day-to-day business is a net positive, how have their researchers responded?
Steven Levy: I've gotten to spend a lot of time with some of the researchers, both talking on the record, and I would see them at conferences, and we would talk more confidentially. And a lot of the people who are in Facebook research are people who could have had terrific jobs in academia. They're really accomplished, people with Ph.D.s.
Steven Levy: And what excites them is not so much publishing because the opportunities for publishing, while they exist at Facebook, are not abundant because, you know - the reason I can get into in a second. They've been overly cautious about it. But they have a chance to affect the product - the product that billions of people use. And if you're in academia, you usually don't have that opportunity. You publish something, and maybe your peers will see it. But you certainly don't have a chance to change the world.
Steven Levy: In recent years, it's gotten more difficult to argue that it's a great thing to work for Facebook, because so much has come out about Facebook and the way it can be harmful. And I think the mindset has changed among some of these people to think, well, maybe we can mitigate this. Maybe we could do research that unearths things that will make Facebook less toxic.
Dave Bittner: Yeah. That's a fascinating aspect to me. Because I can't help thinking, you know, sort of that old phrase, when good people work for bad companies. You know, I'll admit I'm not particularly a fan of Facebook myself. How much of that is going on within the company? I mean, it sounds like these folks realize that there are problems and issues. But I guess what you're saying is they feel as though maybe the best way to fix it is from the inside.
Steven Levy: Yeah. Well, it's almost impossible to fix it from the outside because Facebook does what it wants. So...
Dave Bittner: Yeah.
Steven Levy: You know, the regulators have tried, and there's been a lot of pressure on Facebook to change it. And that really doesn't move Mark Zuckerberg, who's the person who makes all the decisions at Facebook. The ultimately big decisions come down to Mark Zuckerberg. So it's an interesting thing.
Steven Levy: I mean, there are people who work on the way a product works. There's a lot of people that work on research to help Facebook get more revenue - you know, what works to make ads more attractive to people. And then there's research that goes on about misinformation and security and what goes on there.
Steven Levy: And The Wall Street Journal - it was really fascinating, that report they did, because they had a series of presentations that the researchers gave to show Facebook where it was failing and implying, of course, that Facebook should improve on those areas. And in the cases that you saw in the Journal, these presentations went up to the very top. And ultimately, Mark Zuckerberg, you know, and his people around him decided not to take decisive steps to mitigate the problems that the researchers were finding - things like how Instagram was creating mental health problems in teenage girls.
Dave Bittner: Is this ultimately just about growth and money and profits? I mean, why do you suppose they're so hesitant to make meaningful changes here?
Steven Levy: Well, growth is the North Star at Facebook. In my book, I devoted a lot of time to tell, for the first time, the story of Facebook's growth circle, they called it, which used all kinds of means, some of them pretty dicey, to get and retain users. And that is the key to Facebook, really.
Steven Levy: And money is important because, you know, that enables Facebook to spend money to grow more and retain more users. Connecting the whole world is important to Facebook. And they have to do that in light of competition now from places like TikTok, which draws people away from Instagram and Facebook. Probably those TikTok users aren't using the Facebook main app anyway. And you know, there's only a certain amount of time people spend in a day. So that is really important.
Steven Levy: And as it turned out, when push came to shove in certain ways, the way Zuckerberg chose to look at it was to say, wait a minute. A fifth of our users - the teenage girls using Instagram - you know, it makes them feel bad. It aggravates their mental health problems. That means, like, four-fifths are doing great, right? So let's go with that. But obviously, a fifth of the teenage girls who use Instagram represent millions of people, quite literally.
Steven Levy: So there's something really wrong if your researchers come to you - and if you look at these slides, it's almost like they're begging the leadership of Facebook to do something about it. You know, you're saying, our product is making millions of teenage girls feel bad. And some of them with mental health problems are saying these problems are aggravated by it. That's a serious problem for a company - to make the lives of millions of teenage girls miserable or worse, even. You would think that all steps, any step possible, would be taken to change that situation. But in this case, at least according to the Journal reporting, those steps weren't taken. They were saying, well, you know, gee, if we change that, people will use Instagram less.
Dave Bittner: How do you reconcile this? I mean, obviously in the work that you've done for your book, "Facebook: The Inside Story," you spent probably more time than any other journalist with the top-level people at the company. What is your take on how they think about these sorts of things? I mean, is this - do they have blinders on?
Steven Levy: Well, I think they - it's not so much blinders. It's what they prioritize. And there is a belief - and I think they truly believe it - that overall, Facebook is good for society and good for people. And it is an unprecedented situation that the company is in. No one has ever built a social network to get a really significant chunk of the world's population on it at the same time and allow anyone on that network to post things that any other person in the network can see and sometimes many, many people in the network can see. That's something that - it's happened for the first time, and it opens up problems that no one's had to deal with before.
Steven Levy: But given that, when you see that unique situation develop in a way that's causing misery and danger to many, many people, it's not enough to say, well, on balance, we're doing good. And I think they don't shift from that. You know, they see it more like it's an inconvenience, or you could wave that off and think that's OK.
Steven Levy: But it's more like a plane crash than an inconvenience when you have people feeling bad, right? We tolerate some things that we don't like, you know, like crowded subways, right? People get into a crowded subway. You don't have to stop crowded subways - at least, you didn't before COVID. But plane crashes or subways that - you know, like, once a day, you have a subway, like, crash, and people burn up - we wouldn't tolerate that. Even if you would say, hey, we ran, like, you know, 10,000 subways today. Only one crashed and burned everyone.
Dave Bittner: You know, Facebook is well-known to be insular with their research, not wanting to share their results and not wanting to allow outsiders to have access to their data. Were they more open in the past?
Steven Levy: Yeah. There was one moment that happened in Facebook. They did a study in 2012. It's called the emotion study, where they were testing what happened if you put some, you know, kind of negative posts on there. Would people use Facebook less? And you know, people charged that they were using people as guinea pigs without their knowledge. And some of them, you know, wound up using it slightly less and showed that maybe they were less happy. It wasn't like they went into, like, a chronic tailspin depression. But they felt a little worse about themselves. And it got a lot of negative publicity. And you know, from then on, Facebook was much more cautious about what they published.
Steven Levy: Facebook is also being super cautious about sharing data with researchers that want to come up with, you know, answers to the questions of whether Facebook is causing mayhem in society, whether, you know, the misinformation has an effect on, you know, politics or on voters. And this is stuff that's important for society to know. But Facebook, though sometimes it promises access, sometimes, in the recent years, has been finding reasons to pull back that kind of cooperation.
Steven Levy: And I think that's partly, you know, because of the privacy problems they cite about sharing information. But there are ways to address that. It really looks like the PR aspect is something that motivates Facebook. Now, when your own researchers come up with stuff, you don't have to publish it. You don't have to share it. Though in this case, we got to see it because there was a leak.
Dave Bittner: Yeah, which I think is interesting in itself - I'm speculating, of course, but it could indicate that there is perhaps some unrest within the organization.
Steven Levy: Well, one researcher actually did, after the Wall Street Journal post, a Twitter stream showing that he was dissatisfied. I think he's still working there. I haven't heard that this person has been fired. But my suspicion is that he voiced a dissatisfaction that was somewhat broader than it might have been a couple of years ago.
Dave Bittner: Where do you suppose things have to go for us to see meaningful change here? Is this something where we could see - if Facebook doesn't make effective change to themselves, perhaps we'll see some regulation?
Steven Levy: I think it's more likely the more we see leaks like this coming out - let's say it's in the category of shocking but not surprising. People don't really expect Facebook to be dealing honestly with them anymore. Certainly the legislators that have been trying to get information out of them, the regulators, don't think that. There's a whole class of skeptics and critics of Facebook who wouldn't be surprised by this.
Steven Levy: The independent board that Facebook set up, whose job it was basically to rule on decisions that Facebook made that people are challenging, overstepped their charter intentionally and said, wait a minute. We want to get into this. We want to look into this. So they're going rogue in a way, which is kind of interesting.
Steven Levy: I think ultimately, this pressure is going to lead Facebook to make some changes, maybe not willingly.
Dave Bittner: All right, Ben, what do you think?
Ben Yelin: So part of me - I want to continue to have a Facebook profile. And in case any of the powers that be are listening, I want to be careful in what I say.
Dave Bittner: (Laughter) OK.
Ben Yelin: Zuckerberg is just, to me, off the rails in terms of his priorities. And what's happened with this research group is - based on this interview, is just kind of tragic to me. I mean, they've compiled evidence of how things go viral, how the things that go viral can be damaging. And Zuckerberg seems primarily concerned still with the bottom line and making excuses on behalf of Facebook.
Dave Bittner: Yeah.
Ben Yelin: And I just don't see that changing any time soon. It really reminded me of - and I think you've made this comparison before - what we went through with the tobacco companies over the last, you know, maybe 30, 40 years...
Dave Bittner: Right.
Ben Yelin: ...Where they kind of tacitly acknowledge that their product is dangerous and could potentially be harmful, but there are ways of obfuscating that and, you know, saying it's not really the fault of the tobacco. It's the smoke. Or...
Dave Bittner: Right. Right.
Ben Yelin: You know, it's those types of things where they're evading responsibility. And with all due respect to Mr. Zuckerberg - and he's a man of remarkable accomplishment. Please don't delete my Facebook profile.
Dave Bittner: (Laughter).
Ben Yelin: I just really see a similarity there. So it was an eye-opening interview. It makes me really want to read the book, and I'm glad you did it.
Dave Bittner: Yeah. The thing - I think what really sort of turned a light bulb on for me was when Steven said that, you know, his take is that the folks at the top levels of Facebook really do believe that they're making the world a better place.
Ben Yelin: Yes. I think they do legitimately believe that.
Dave Bittner: Yeah.
Ben Yelin: And I don't think they're entirely wrong. I mean, they are fostering some things that are very productive - keeping us interconnected...
Dave Bittner: Right.
Ben Yelin: ...Allowing us to, you know, stay close with friends and family. I think that sort of 2004 to 2007 version of Facebook was very admirable. But then Frankenstein got out of the lab.
Dave Bittner: Yeah.
Ben Yelin: And we've gone to a place where it's much more harmful.
Dave Bittner: Yeah. It does strike me - for me personally, I see it as being sort of a moral failure that - you know, how much money do you need? How much growth do you need? You have the No. 1 social network in the world, you know, or certainly - you know, depending on how you measure it. And if you find that it is doing harm, you've got to fix that. You need to take that seriously. If people are being harmed, if people - if teenagers are taking their own lives...
Ben Yelin: Right, this mental health issue among Instagram users.
Dave Bittner: ...How do you not immediately say, OK, stop. Everybody, just stop. We need to take a serious look at this, and we're going to put growth on hold while we figure out how to first do no harm. I will never understand the priorities that keep them from doing that.
Ben Yelin: Dave, as the great Bob Dylan once said, all the money you make can never buy back your soul.
Dave Bittner: (Laughter) Right. Right.
Ben Yelin: Please don't delete my Facebook profile.
Dave Bittner: (Laughter) Right. All right. Well, again, our thanks to Steven Levy for taking the time for us. A real treat for me personally to get to chat with him. Again, his book is titled "Facebook: The Inside Story." And for his latest writing, you can always head on over to WIRED and see what he's up to there. Always time well-spent checking out his writing.
Ben Yelin: For sure.
Dave Bittner: All right. That is our show. We want to thank all of you for listening.
Dave Bittner: The "Caveat" podcast is proudly produced in Maryland at the startup studios of DataTribe, where they're co-building the next generation of cybersecurity teams and technologies. Our senior producer is Jennifer Eiben. Our executive editor is Peter Kilpe. I'm Dave Bittner.
Ben Yelin: And I'm Ben Yelin.
Dave Bittner: Thanks for listening.