# Quantum-proof and ready: NIST unveils the future of encryption.

**Dave Bittner:** Thanks for joining us for this CyberWire special edition. Dustin Moody is a mathematician at NIST and recently N2K's Brandon Karpf sat down with Dustin to discuss the first three finalized post-quantum encryption standards. Here's their conversation. [ Music ]

**Brandon Karpf:** I'm joined today by Dustin Moody, Supervisory Mathematician at the National Institute of Standards and Technology. And Dustin's here today to fill us in on the recent standards released by NIST around post-quantum cryptography. Dustin, great to have you on the show. Really excited to have this conversation.

**Dustin Moody:** Great, happy to be here.

**Brandon Karpf:** So could you fill us in just on the background of NIST's project for post-quantum cryptography and where we've gotten to today, and ultimately what the goal is for the program?

**Dustin Moody:** Yeah, certainly. So since the 1990s, cryptographers and others have been aware that if a large-scale quantum computer could be built, it would break some of the cryptosystems that we rely on to protect our information. Back then it was mostly a theoretical concern, because quantum computers were imagined, not realities. But since then, different companies and organizations have been working on building them because they would bring a lot of positive benefits to society. They could do a lot of things that our current computing technology cannot. At NIST, our particular group deals with cryptography, and we approve the algorithms that the federal government uses to protect all of our information. So we were aware of this. Probably around 10 years ago we started scaling up our project, because we saw that progress in quantum computers was growing and that they were becoming larger. They're not large enough to threaten current cryptographic algorithms, but we need to get these standards in place well in advance of that. So at NIST, we started building our team and our expertise, and we eventually decided that the best way to create new standards, to get new cryptosystems in place, would be to do a large international competition-like process to select algorithms that we would evaluate internally and that the cryptographic community could also evaluate. NIST has done this sort of thing in the past, and it has gained a lot of acceptance and credibility, because people can trust the algorithms that come out of it since they've been so well studied. So we announced back in 2016 that we would be doing this. In response, we received a large number of submissions, a total of 82, sent in from different teams around the world who had all designed the best algorithms they could come up with to provide protection. Over the past eight years or so, we've gone through a series of evaluation and analysis.
We've looked at them internally, implemented them, and checked their performance benchmarks, and people around the world have been doing the same thing. Some of them were broken along the way. That's what happens. The strongest ones survive, and we have more confidence because they've been studied so carefully. So after a series of three rounds, back in July of 2022, we announced the four algorithms that we would be standardizing as a result of this process. Since that time, it took us a year or two to write up the standards for those algorithms, but that's where we are now.

**Brandon Karpf:** Great. So, could you walk us through those standards? Three were officially released just in the last few weeks, and it sounds like a fourth might be on its way. Can you walk us through these?

**Dustin Moody:** Yeah, so we were looking for two different cryptographic functionalities, one of which is to do key establishment, or equivalently encryption, and another is to do what's called digital signatures, which are used to provide authentication online. We selected a few algorithms for each category. For digital signatures, we selected an algorithm called CRYSTALS-Dilithium, which is the main algorithm that we expect people to use. It's based on something called lattices. We can get into all the math if you really wanted to, but most people are just happy to know that it's based on something called lattices. We also selected two other algorithms. One is another lattice-based algorithm called Falcon. It has smaller key sizes than Dilithium, but its implementation is a lot more complex. You have to use floating-point arithmetic, and many devices might struggle to implement it securely. So it's available for certain applications that really need those shorter signatures, but most applications will be able to use Dilithium just fine. The third signature that we selected is called SPHINCS+. It's based on a different idea than lattices. The idea there is to have a backup: in case some attack or vulnerability is discovered in lattices, we have something not based on them. It is around for that purpose. However, it's a bit slower and bigger, so it wouldn't work in many applications. But if security were your number one concern, its security analysis is a bit more conservative, so for some users it might be their choice. So those were the three signatures. For key establishment, we selected an algorithm called CRYSTALS-Kyber, which is also based on lattices. Over the course of the past eight years, lattices turned out to be the most promising area for post-quantum algorithms. They have great performance and great security. So it was selected. Now, three of those were standardized: Kyber, Dilithium, and SPHINCS+.
They came out in documents that we call FIPS, Federal Information Processing Standards. And they first went out for public comment. We had a draft form. We got some feedback, made a few small changes, and then just a week ago we published them in their final form so that people can begin to use them. The fourth algorithm, Falcon, that was selected, we are still writing the standard. It's not yet done. We wanted to focus on Dilithium first because it's the primary signature we want people to use. And because of the complex implementation, it's just taken us a little bit longer to write the standard and we hope to have it out by the end of 2024.

**Brandon Karpf:** Great. Well, curious here because you've mentioned a couple different techniques and you mentioned lattice and it seems like three of the four are based on lattice techniques. What about that approach to cryptography makes it inherently more secure against quantum-based attacks?

**Dustin Moody:** Yeah, so over the past several years, people have been studying that exact question you asked: how can we protect against quantum computers? The cryptosystems we use today, it turns out, are all based on hard mathematical problems that are difficult for a computer to solve. So an algorithm that we use today, known as RSA, relies for its security on the fact that if you have a really, really large number, it's hard to break it down into its prime factors. For quantum computers, what we needed to find were algorithms based on hard problems that quantum computers are not known to be able to solve any faster than classical computers. Mathematicians and computer scientists have studied that for a few decades and found a few different promising areas. Lattices were one. There are a couple of hard problems associated with lattices, known as the shortest vector problem and learning with errors. Another family of algorithms is based on what are called error-correcting codes. A third is based on multivariate algebra. And for each of these, it's hard to describe exactly why a quantum computer doesn't seem to be able to break them, other than that very smart scientists who understand quantum algorithms have tried their best. They've looked at all the known algorithms, and none of them seem to provide any avenue of attack that would break these. We have no absolute guarantee, but that's also the case in current cryptography. It could be that, you know, some brilliant person comes up with a new idea that breaks what we're using today.
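The learning-with-errors problem Dustin mentions can be illustrated with a toy sketch. The parameters here are tiny and hypothetical, nothing like a real scheme: each sample hides the secret vector behind a small amount of noise, which turns otherwise easy linear algebra into a problem believed hard even for quantum computers.

```python
import random

random.seed(0)  # fixed seed so the toy example is reproducible

q, n, m = 97, 4, 8  # tiny toy parameters; real schemes use far larger ones
s = [random.randrange(q) for _ in range(n)]  # the secret vector

# Each LWE sample is (a, b) with b = <a, s> + e mod q, where e is small noise.
samples = []
for _ in range(m):
    a = [random.randrange(q) for _ in range(n)]
    e = random.choice([-1, 1])  # small nonzero noise
    b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
    samples.append((a, b))

# Without the noise e, recovering s from the samples is basic linear algebra.
# With the noise, every equation is slightly wrong, and solving for s is
# believed hard even for quantum computers.
exact = all((sum(ai * si for ai, si in zip(a, s)) % q) == b for a, b in samples)
print(exact)  # False: every sample carries nonzero noise
```

The noise is the entire point of the design: it is small enough that a legitimate decryptor (who knows the secret) can remove it, but large enough to defeat straightforward equation solving.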

**Brandon Karpf:** Sure, that's science and that's discovery. That's how that works. I'm curious about the threat space that we're talking about here. So, you know, obviously there's been a motivator. You discussed this a little bit, even dating back to the '90s, theoretically we were thinking about the possibility of quantum computations, being able to do things like factor large numbers into their primes faster than traditional computers. And you just discussed with lattice-based techniques, the ability to make that a hard problem, right? Not something that even a quantum computer could accomplish or could break in a reasonable amount of time. Could you talk a little bit about the threats right now, though, that -- or right now or in the near future that we would be facing if we don't move forward implementing these post-quantum or quantum-resistant encryption algorithms?

**Dustin Moody:** The online world that we live in, you know, we buy things online, we send emails that have information that should be protected, we've got our medical records, all of this is behind the scenes, protected by cryptography, that most people don't think too much about because it's all taken care of in your browser or by whatever application you're using to send that information, but it really is all dependent on secure cryptographic algorithms. There's many different types of cryptographic algorithms that are used. We have some that are called public key algorithms, some that are called symmetric key algorithms, and that threat I was talking about from quantum computers relates directly to the public key crypto systems which are employed today.

**Brandon Karpf:** Okay.

**Dustin Moody:** There is an impact to the symmetric key algorithms from quantum computers, but it's not as drastic. It won't completely break those algorithms. It'll mean at most we need to use longer key sizes for those algorithms, which we can do. That's much more manageable.

**Brandon Karpf:** Although when I think about something like a larger key size, I think about the processing required to implement that and the time it would take to actually accomplish that type of encryption. Is that a consideration?

**Dustin Moody:** It would be annoying, but it's actually not too big of a deal. Right now, the main symmetric key algorithm used to encrypt data is known as AES. Many users use a key size that's 128 bits. That's fairly small. To protect against a quantum computer, it's known that if you went up to AES-256 or just doubling the length of your key, you could use the same algorithm, just a longer key. The performance impact would be pretty negligible because 128 bits to 256 bits really isn't that much when you implement AES. It's on the public key side that we have more of a problem, because Shor's algorithm would completely break every single one of the public key algorithms that we use today. That includes RSA, elliptic curve cryptography, Diffie-Hellman, and so it's those algorithms that we have to replace, otherwise we're going to be vulnerable and people will be able to get access to information that should be protected, you know, if a large-scale quantum computer comes out, and you're not using post-quantum algorithms.
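The doubling rule Dustin describes follows from the quadratic speedup of quantum search (Grover's algorithm, not named in the conversation): a k-bit key offers only about k/2 bits of security against a quantum brute-force attack, so doubling the key length restores the original margin. A minimal back-of-the-envelope sketch:

```python
# Grover's algorithm searches an unstructured space of size 2^k in roughly
# 2^(k/2) steps, so a k-bit symmetric key offers about k/2 bits of security
# against a quantum brute-force attacker.

def effective_quantum_security(key_bits: int) -> int:
    """Rough effective security of a symmetric key under quantum search."""
    return key_bits // 2

for key_bits in (128, 256):
    print(f"AES-{key_bits}: ~{effective_quantum_security(key_bits)} bits "
          f"against a quantum brute-force attacker")
# AES-128: ~64 bits; AES-256: ~128 bits
```

This is why moving from AES-128 to AES-256 is considered a manageable fix on the symmetric side, while the public key side needs entirely new algorithms.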

**Brandon Karpf:** Got it. So what, you mentioned Shor's algorithm. Could you talk us through that?

**Dustin Moody:** Yeah, so Peter Shor is a scientist from MIT who, back in the 1990s, was working on algorithms that would run on a theoretical quantum computer. They did not exist yet. And he came out with an algorithm that, ultimately, finds patterns. It finds a period. The period means the time it takes until something repeats itself. He was able to come up with an efficient algorithm that, if you had a large-scale quantum computer, would be able to run in minutes or hours, depending on how big the quantum computer is. And he noticed that if you can find periods, you can translate these hard math problems I've talked about, factoring and another one known as the discrete log, into needing to find the period of certain things. And so when he noticed that, his algorithm became a quantum algorithm that would solve the factoring and discrete log problems efficiently, faster than our current computing technology can. And that makes it very disruptive for cryptography.
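The reduction Dustin outlines can be demonstrated classically on a tiny number. This toy sketch finds the period of 7 mod 15 by brute force (the step a quantum computer performs exponentially faster) and then recovers the factors of 15 from that period:

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    """Find the order of a mod N: the smallest r > 0 with a^r = 1 (mod N).
    This brute-force loop is exactly the step Shor's algorithm replaces
    with an efficient quantum subroutine."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

# Shor's reduction: factor N = 15 using the period of a = 7 mod 15.
N, a = 15, 7
r = find_period(a, N)                 # 7^1=7, 7^2=4, 7^3=13, 7^4=1, so r = 4
assert r % 2 == 0                     # the reduction needs an even period
factor1 = gcd(a ** (r // 2) - 1, N)   # gcd(48, 15) = 3
factor2 = gcd(a ** (r // 2) + 1, N)   # gcd(50, 15) = 5
print(r, factor1, factor2)            # 4 3 5
```

For a 2048-bit RSA modulus the period-finding loop above is hopeless classically; on a large-scale quantum computer it becomes efficient, which is why factoring-based cryptography breaks.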

**Unidentified Person:** We'll be right back. [ Music ]

**Brandon Karpf:** When I talk to folks about quantum and the post-quantum age, you know, the sense I get is a lot of people even today still think of this as science fiction. They think of the quantum age as being decades and even potentially centuries into the future. They don't see it as a present threat. What NIST is telling us and what NIST is working on and investing in this program for now eight years and publishing these standards is that this is something that the government and at the highest levels care about and are investing time and resources in. Can you walk us through kind of why it's not really science fiction, why it is a threat today, and the ways in which cybersecurity professionals need to be thinking about this technology?

**Dustin Moody:** So, with regards to science fiction, I mean, there are companies that have built quantum computers that are on the marketplace today and that you can use. IBM has quantum computers that you can interface with online. Now, these are small quantum computers, but they are quantum computers that do things that classical computers cannot. They are not yet at the size where they are solving a lot of the problems that would be really beneficial to society, but it's predicted that within 10 to 15 years they might be able to get there. So we're still not quite there yet, but there's been significant progress made over the past decade or two, and we do have small ones. You asked about the threat. Yeah, the US government is taking the threat of quantum computers being able to break cryptography very seriously. And let me paint a picture that is a little bit counterintuitive as to why. This threat is often known as store now, decrypt later, or harvest now, decrypt later. It's the idea that your data today is actually at risk from a potential foreign adversary with a quantum computer, even though they don't yet have that quantum computer. That sounds a little strange, but the scenario is: imagine you've got all your data now and you've encrypted it, it's protected, and suppose maybe it's national secrets that need to be secure for 30 years, we'll say. Okay, so you've encrypted it, great. Suppose an adversary is able to get a copy of that data. Maybe you stored it in the cloud or they found a vulnerability. You're not too worried because it's encrypted and they can't read it. But maybe a quantum computer comes out in 15 years, and they're then able to get access to that data that you were hoping was going to be secure for 30 years. So you're actually already at risk today, even though that computer won't come out for 15 years.
And that just underscores the need to why we need to have these algorithms and standards in place well before a quantum computer comes out there so that you can make sure your information is protected for as long as it needs to be.

**Brandon Karpf:** So even, you know, with the best guess of 10 to 15 years before a quantum computer of the right size and power capabilities can decrypt our public key cryptography today, we're already a little bit behind the curve.

**Dustin Moody:** Potentially. Now, when we encrypt our data, we don't use public key encryption to encrypt it. Typically we'll use symmetric key encryption. However, how you created that symmetric key you're using to encrypt the data may have been created through public key means. So the exact details of whatever application you're using, you have to look into that to see. But in some circumstances, yeah, you could already be at risk if you have not sufficiently protected your data, you know, using layers of defense to protect it. That's true.
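The pattern Dustin describes, where the symmetric key protecting your data is itself created through public key means, can be sketched with a toy classical Diffie-Hellman exchange. The parameters are deliberately tiny and quantum-breakable (and are hypothetical values chosen for illustration); a real deployment would use a standardized KEM such as ML-KEM plus AES, not XOR:

```python
import hashlib

# Toy Diffie-Hellman key establishment. Tiny, insecure parameters, shown
# only to illustrate how a symmetric key comes from a public key exchange.
p, g = 2087, 5                      # small public prime and generator (toy values)
alice_secret, bob_secret = 123, 456
alice_public = pow(g, alice_secret, p)
bob_public = pow(g, bob_secret, p)

# Both sides derive the same shared secret, then hash it into a symmetric key.
alice_key = hashlib.sha256(str(pow(bob_public, alice_secret, p)).encode()).digest()
bob_key = hashlib.sha256(str(pow(alice_public, bob_secret, p)).encode()).digest()
assert alice_key == bob_key

# The bulk data is then encrypted symmetrically (XOR stands in for AES here).
def xor_encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(d ^ key[i % len(key)] for i, d in enumerate(data))

ciphertext = xor_encrypt(alice_key, b"medical records")
print(xor_encrypt(bob_key, ciphertext))  # b'medical records'
```

This is exactly why harvested traffic is at risk: an adversary who records the public values and the ciphertext today can solve the discrete log with a future quantum computer, re-derive the symmetric key, and read the data.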

**Brandon Karpf:** So for organizations now looking at these standards, these new standards and the new, you know, NIST-approved algorithms, what goes into implementing them? What's going to be the process like? What's going to be the timeline? What's going to be the requirements for an organization to get their minds wrapped around successfully implementing these algorithms?

**Dustin Moody:** So we hope organizations and companies have been aware of post-quantum for a while now and have been looking into this to understand the threat and to be looking at the algorithms that were going to be coming out, so that this wouldn't be a surprise. We know the migration to these algorithms will be tricky, complicated, and costly. In terms of implementation, I think most people won't be implementing this for themselves. There will be vendors and libraries that implement them and that go through validation and testing to make sure their products are certified. So for a lot of people, it will mean talking to their vendors to find out if their vendors have implemented the algorithms and passed testing and validation. Another important step organizations need to be thinking about is finding out where they're using cryptography. That's a tricky question that most people don't know the answer to. It can require software tools that will scan your systems and find out where your data is being protected and what algorithms are protecting it, because you can't simply migrate to a new algorithm unless you know where the cryptography you need to migrate is being used. So yeah, there's a lot that goes into it. We recommend organizations definitely have somebody who has this on their plate, leading an effort to think about this and make a plan for their organization. It's going to take time. The standards have come out. It'll probably be a year or two before we start to see lots of products that have implemented them, have been tested, and are on the marketplace being sold.
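The kind of discovery scan Dustin mentions can be sketched minimally: search text for the names of quantum-vulnerable algorithms. The file contents below are hypothetical stand-ins for a real code base, and real inventory tools also inspect binaries, certificates, and TLS configurations:

```python
import re

# Minimal sketch of a cryptographic-inventory scan: grep source text for
# quantum-vulnerable algorithm names so you know what needs migrating.
QUANTUM_VULNERABLE = re.compile(r"\b(RSA|ECDSA|ECDH|DH|Diffie-Hellman)\b",
                                re.IGNORECASE)

# Hypothetical file contents standing in for a real code base.
files = {
    "auth.py": "signature = rsa.sign(key, message)   # RSA-2048",
    "tls.conf": "KexAlgorithms ecdh-sha2-nistp256",
    "hash.py": "digest = hashlib.sha256(data).hexdigest()",
}

findings = {name: QUANTUM_VULNERABLE.findall(text) for name, text in files.items()}
for name, hits in findings.items():
    if hits:
        print(f"{name}: uses {sorted(set(hits))} -> review for PQC migration")
```

Note that the symmetric hash usage in `hash.py` produces no finding: as discussed above, it's the public key algorithms that need replacing first.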

**Brandon Karpf:** Okay.

**Dustin Moody:** Even then, we estimate it'll take 10 to 15 years for the transition to occur, because this is just one threat. People who manage information security have many threats, many things on their plate, and it takes time; you can't just flip one switch and this problem is solved. So we expect the transition will probably take 10 to 15 years.

**Brandon Karpf:** Got it. Well, so in that time then, what comes next for you and NIST and the program that you're a part of?

**Dustin Moody:** So we're trying to help make the migration as smooth as possible. NIST has what's called the National Cybersecurity Center of Excellence. It's running a Migration to Post-Quantum Cryptography project where we've partnered with around 40 to 50 industry partners to help develop tools, come up with guidance, and learn best practices to ease the migration. It's really great to see these companies working together to make the migration as smooth as possible. So that's a great resource people can turn to for information. For me, we're also still working on standardization. We have these first algorithms that were selected, but we know that research is going to continue to evolve. We have a few algorithms that are still in a fourth round of evaluation from our competition-like process. We will likely select one or two of those to standardize within a couple of months, and we'll write the standards for them. Those will complement Kyber, the encryption or key establishment algorithm that I talked about. We had three signatures but only one key establishment algorithm, and so these will add to those numbers. And then we also have another standardization project going on. I mentioned that SPHINCS+ is a little bit big and slow for not being based on lattices, and many applications might have a hard time using it. So we called for new submissions for signature algorithms not based on lattices that would be better performing than SPHINCS+. In response, we received 40 different submissions. So we're in a multi-year process to evaluate all these, and maybe at the end --

**Brandon Karpf:** That's a big project.

**Dustin Moody:** Yeah, we might select another one or two to standardize out of this. So we're going to be dealing with a lot of fun cryptography for the next several years.

**Brandon Karpf:** Got it. So a lot of research still to be done in evaluating them. I imagine you're looking at efficiency and resilience to various attacks. To what extent does the community outside of NIST participate in that type of validation process?

**Dustin Moody:** Oh, we rely a lot on the external cryptographic community. We have a very talented team here at NIST, but there's not enough of us to do all the analysis that would be required; the original process had over 80 different algorithms submitted. So we rely on the cryptographic community a lot, and we're very thankful for their efforts. They do a lot of research, they do a lot of benchmarking, and they publish their results. There are conferences dedicated to this. Industry has people who are also doing benchmarking, putting these in real-world protocols to see how they run and how the timing is affected. So it's a great worldwide effort that has been very, very collaborative, and it's been pretty cool to see. The main thing we look at when we evaluate is, first and foremost, security. We need these algorithms to be secure against quantum computers, but also our current classical computers, because those aren't going away. Those will probably still be the main computers that people have and use. The second criterion is performance, and we're looking at a lot of different performance benchmarks on a variety of platforms: servers, cell phones, lightweight devices. We're looking at things like the key sizes, the signature sizes, the ciphertext sizes, and how the bandwidth is affected. Many of these algorithms are larger than what we're used to with RSA and elliptic curve cryptography, so there could be some challenges for protocols and applications that have to handle the larger keys and larger signatures. Those are some of the things that we look at when we're evaluating these algorithms.

**Brandon Karpf:** And we do love to hear about big international collaborative projects like this. You know, it's nice when a whole community gets together around a common mission, especially one of security.

**Dustin Moody:** And to give maybe some numbers to it, we have an online mailing list called the PQC Forum, where we have a lot of discussions and questions, and there's over 3,000 members on that PQC Forum. Many are there just to listen, but there are several hundred that are active and post questions and answer questions and things like that. So it's definitely a large community.

**Brandon Karpf:** Love to hear it. Love to see it, too. Dustin, well, you mentioned a couple resources. We will have links to those in the show notes for this episode, the NCCoE and some of the other work that you've mentioned. Really appreciate you coming on.

**Dustin Moody:** Thank you. Thanks for having this conversation and helping spread the word.

**Dave Bittner:** That's Dustin Moody, mathematician at NIST, speaking with N2K's Brandon Karpf. You can find a link to the newly released standards in our show notes. Thanks so much for joining us. We'll see you back here next time. [ Music ]