Dave Bittner: And joining me once again is Malek Ben Salem. She's the senior R&D manager for security at Accenture Labs. Malek, it's always great to have you back. I wanted to touch base with you on the news we've been seeing lately when it comes to facial recognition systems. I wanted to get your take on where we are with the technology and where things stand.
Malek Ben Salem: Yeah, so facial recognition technology has spread widely over the last decade, especially due to advances in big data, deep convolutional networks and graphics processing units, or GPUs. And we see them being used widely. You know, most people know them from social networking platforms, where people's faces get tagged in pictures. They're used to spot missing people, to catch slackers who lie about the hours they spend in the office. Most recently, they've been deployed at - I believe - the Hyderabad Airport, so you can use your face now as your boarding card. So the uses continue to grow thanks to the advances in computational power and to deep learning. But there are issues with the technology itself.
Dave Bittner: What kind of concerns are you tracking?
Malek Ben Salem: Well, there's obviously the privacy concern - the fact that these technologies are being used everywhere, not necessarily with people's consent. As a matter of fact, just last week, one school in Europe was fined because it used facial recognition systems to track the presence of students at the school. This was in Sweden, and a fine of about 20,000 euros was issued against the school because of that use.
Malek Ben Salem: But beyond the privacy concerns, facial recognition systems, just like any machine learning systems, reflect the data that they get trained with. And because a lot of the data that they were trained with was not reflective of entire populations, they end up having biased results. So no matter what accuracy improvements they've been able to achieve overall, across the widest population, they don't perform as well for certain demographic groups, which makes them unreliable. So if we think about uses in law enforcement, for instance - matching certain faces with people of interest or people who have committed crimes before - then it has been noted that people from certain demographic populations are more likely to be matched to the faces of interest.
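The gap Malek describes is easy to miss if a system is only scored on aggregate accuracy. A minimal sketch of the idea, using entirely made-up toy data (not results from any real system), is to compute match accuracy separately per demographic group instead of one overall number:

```python
# Toy illustration with hypothetical data: a recognition system can look fine
# on overall accuracy while underperforming for a specific demographic group.
# Each record is (group, prediction_correct).
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", True),
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False),
]

def accuracy(records):
    """Fraction of correct predictions in a list of (group, correct) records."""
    return sum(correct for _, correct in records) / len(records)

overall = accuracy(results)
per_group = {
    g: accuracy([r for r in results if r[0] == g])
    for g in {g for g, _ in results}
}

print(f"overall: {overall:.2f}")        # 0.80 overall looks acceptable...
for g in sorted(per_group):
    print(f"{g}: {per_group[g]:.2f}")   # ...but group_b is only 0.50
```

The same disaggregated view applies to false positive rates on a watch list, which is the failure mode that matters most in the law enforcement scenario discussed here.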
Dave Bittner: Yeah. It seems like that's a high-risk proposition there, where that's a situation where it's really important to get it right.
Malek Ben Salem: Yeah, absolutely. Absolutely. And that is why we need to take a look back at the data sets that are used to train these facial recognition systems to address this bias problem, address this false positive problem when dealing with watch lists.
Dave Bittner: Now, is this something that you think, as time goes on, the reliability is going to improve or are we ever going to see these get to the point where we feel like we can trust them?
Malek Ben Salem: I think so. I think the technology will continue to improve. For instance, we know that, up to this point, these systems have had difficulty distinguishing twins. But they can be complemented with certain techniques so that they're able to distinguish the faces of twins - for instance, by looking at pores within the twins' faces and computing the distances between (laughter) those pores, they may be able to build additional discriminative power between the faces of twins. Another thing that can be leveraged is how people walk. If we're not just looking at the face of the person but at an entire video of a person walking or moving, then we're able to improve the accuracy of these algorithms and systems that way.
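The pore-distance idea mentioned above can be sketched in a few lines. This is a hypothetical illustration only - the coordinates are invented, and real systems would detect pores from high-resolution imagery - but it shows how pairwise distances between pore locations can serve as a fine-grained signature that differs even when coarse facial geometry matches:

```python
import math

def pairwise_distances(points):
    """Sorted list of Euclidean distances between every pair of points."""
    return sorted(
        math.dist(p, q)
        for i, p in enumerate(points)
        for q in points[i + 1:]
    )

def distance_signature_gap(face_a, face_b):
    """Mean absolute difference between two faces' pore-distance signatures."""
    da, db = pairwise_distances(face_a), pairwise_distances(face_b)
    return sum(abs(x - y) for x, y in zip(da, db)) / len(da)

# Hypothetical pore coordinates for two faces with similar overall geometry.
twin_1 = [(0.0, 0.0), (1.0, 0.2), (0.5, 1.1)]
twin_2 = [(0.0, 0.0), (1.1, 0.1), (0.4, 1.3)]  # slightly different pore layout

print(distance_signature_gap(twin_1, twin_2))  # nonzero: the twins differ here
```

An identical layout yields a gap of zero, so a threshold on this gap gives the extra discriminative power the conversation alludes to.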
Dave Bittner: All right. Well, it's something that'll continue to develop and certainly merits keeping an eye on. Malek Ben Salem, thanks for joining us.
Malek Ben Salem: Thank you, Dave.