I first published the post below on LinkedIn on 2025-01-25.
The PKI Consortium held a conference on Post-Quantum Cryptography on Jan 15-16. Let me share some background and my takeaways.
Quantum Computers (QCs) could perform some calculations much faster than any traditional computer. Not merely a thousand, a million, or a billion times faster, but radically faster ⏩⏩, providing 'efficient' solutions for math problems we cannot hope to solve with today's computers. They are specialized devices: you cannot browse the web or play 🎮 video games on a QC. You are unlikely to ever have one in your home, as they require very special physical conditions to function. However, they are likely to bring major breakthroughs in areas like optimization algorithms or machine learning, so they are researched extensively. They are also likely to reshape cryptography.
A large-scale Quantum Computer (which does not exist yet) would enable faster attacks against cryptographic algorithms:
- Symmetric key cryptography algorithms we use today are likely to remain secure 🔒 if used with long keys (e.g. AES-256). Shorter keys (e.g. 128 bits) will no longer be secure against QCs (with Grover's algorithm providing a quadratic speedup). This is a major effect, but not earth-shattering.
- Meanwhile, QCs will have a devastating effect on the public key cryptography we use today, as both the RSA and ECC algorithms can be efficiently broken with a QC (with Shor's algorithm yielding an exponential speedup). QCs are going to render today's digital signatures and key establishment protocols insecure ⚡ (and thus certificates, PKI, TLS, etc.), so today's public key crypto algorithms will need to be replaced.
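To make the two speedups concrete, here is a small Python sketch (illustrative only; the function names and the toy numbers are mine). Grover roughly halves a symmetric key's effective security level, while Shor breaks RSA by finding the multiplicative order of a base modulo N -- done here by brute force on a toy modulus, which is exactly the step a QC would perform exponentially faster.

```python
from math import gcd

# Grover's algorithm speeds up brute-force key search quadratically,
# roughly halving the effective security level of a symmetric key.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

# AES-128: ~64-bit quantum security (weak); AES-256: ~128-bit (still fine)
assert grover_effective_bits(128) == 64
assert grover_effective_bits(256) == 128

# Shor's algorithm factors N by finding the order r of a base a modulo N
# (the smallest r with a^r ≡ 1 mod N). Here we find r by brute force.
def order(a: int, n: int) -> int:
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def factor_via_order(n: int, a: int) -> tuple[int, int]:
    r = order(a, n)
    assert r % 2 == 0  # holds for a suitable base; a real attack retries otherwise
    y = pow(a, r // 2, n)
    p = gcd(y - 1, n)
    return p, n // p

print(factor_via_order(15, 2))  # the classic toy example: prints (3, 5)
```

On toy numbers the classical brute-force order search is instant; for a 2048-bit RSA modulus it is hopeless, and that order-finding step is precisely where the quantum exponential speedup applies.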
Post-Quantum Cryptography (PQC) is about migrating to stronger, quantum-resistant crypto algorithms (which will remain safe in the age of Quantum Computers).
It may take years or decades until a cryptographically relevant QC becomes reality. The technical problems seem solvable; one speaker at the conference suggested that how far away QCs are depends merely on how badly people need them and how much money they are willing to spend. The recently announced Willow chip is one step towards scalable QCs, but it does not turn anything upside down. QCs are not a direct threat today, and it will take time from the first scalable QC until your adversaries get their hands on one too. However, preparing for them is not a problem of the far future. There are attackers already collecting and storing encrypted data, hoping they will be able to decrypt it when a QC becomes available. (This is called the 'harvest now, decrypt later' attack ⚡.)
Replacing crypto algorithms is hard. First, the new algorithms need to be researched. Second, they need to be standardized, to allow interoperability. Once standards are ready, devs can create software implementations, but you can only use them when both/all sides of your protocol support the new algorithm. (I recall migrating TLS to SHA2-based certs: a very large part of the Internet had to support SHA2 before people could even start installing SHA2-based certs on their servers.) Even if you are already using the new algorithms, legacy implementations may still opt to downgrade to the old ones (and attackers will do the same). Only once there is a critical mass out there supporting the new algorithms can you enforce the new and disable the legacy algorithms -- this is the point when you are secure 🔒.
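The enforcement step above can be sketched as a toy negotiation: each side advertises its supported algorithms, the strongest mutual one wins, and the migration is only complete once the legacy options are disabled on both sides. All names and the preference order here are illustrative, not any real protocol's registry.

```python
# Strongest-first preference order (illustrative names only).
PREFERENCE = ["ML-DSA-65", "ECDSA-P256", "RSA-2048", "RSA-SHA1"]

def negotiate(ours: set, theirs: set, disabled: set = frozenset()) -> str:
    """Pick the strongest algorithm both peers support and neither disables."""
    for alg in PREFERENCE:
        if alg in ours and alg in theirs and alg not in disabled:
            return alg
    raise ValueError("no mutual algorithm")

# While legacy peers are common, connections silently fall back:
assert negotiate({"ML-DSA-65", "ECDSA-P256"}, {"ECDSA-P256"}) == "ECDSA-P256"

# Only once legacy is disabled do you actually get the new algorithm:
assert negotiate({"ML-DSA-65", "ECDSA-P256"}, {"ML-DSA-65", "ECDSA-P256"},
                 disabled={"ECDSA-P256"}) == "ML-DSA-65"
```

The second assertion is the "critical mass" moment: until you can afford to put the legacy algorithm in the disabled set, a downgrading peer (or attacker) keeps you on the old crypto.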
Past crypto migrations took over a decade 😮, and some speakers even questioned whether they ever ended 😄 (yes, SHA1->SHA2 was hard). Still, if you would like to keep your secrets safe for X years, it would take Y years to migrate to quantum-safe algorithms, and quantum computers are likely to arrive within Z < X+Y years, then you are already too late 😵. (See: Mosca's theorem.)
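Mosca's inequality itself fits in a few lines; the variable names and example numbers below are mine:

```python
def mosca_too_late(x_secrecy_years: float,
                   y_migration_years: float,
                   z_years_until_qc: float) -> bool:
    """Mosca's theorem: if data must stay secret for X years and migration
    takes Y years, you are already too late whenever Z < X + Y."""
    return z_years_until_qc < x_secrecy_years + y_migration_years

# Secrets must hold 10 years, migration takes 10 years, and a
# cryptographically relevant QC arrives in 15 years -> already too late.
assert mosca_too_late(10, 10, 15)

# Short-lived secrets and a fast migration leave comfortable margin.
assert not mosca_too_late(2, 3, 15)
```

The uncomfortable part is that X and Y are under your control and routinely measured in decades, while Z is the one number nobody knows.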
The standardization of PQC algorithms has just concluded. NIST ran a process to select the new PQC algorithms starting in 2016, and released the standardized quantum-safe public key algorithms in 2024:
- ➡️ FIPS 203 ML-KEM (based on CRYSTALS-Kyber), a lattice-based key encapsulation mechanism. This is the only one that can be used for key establishment (like in TLS), the rest are for signatures.
- ➡️ FIPS 204 ML-DSA (based on CRYSTALS-Dilithium 🖖), a lattice-based signature algorithm. This is intended to be the go-to signature algorithm.
- ➡️ FIPS 205 SLH-DSA (based on SPHINCS+), a stateless hash-based digital signature standard, meant as a backup for the previous one. It is not lattice-based but follows the different, more conservative math approach of hash-based signatures (and it is also unique in that its name does not come from a sci-fi franchise 😜).
- ➕ There is one more signature algorithm (FN-DSA, based on FALCON 🦅), which is going to be standardized in the future.
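To illustrate what a key encapsulation mechanism (KEM) is, here is a toy Diffie-Hellman-style KEM in Python. This is not ML-KEM's lattice math and it is not secure -- the group, generator, and function names are mine -- it only shows the three-operation keygen/encaps/decaps interface that FIPS 203 standardizes.

```python
from hashlib import sha256
import secrets

P = 2**127 - 1   # a Mersenne prime; toy-sized, NOT a secure parameter choice
G = 5            # arbitrary generator for the toy group

def keygen() -> tuple[int, int]:
    """Receiver creates a key pair; the public key is published."""
    sk = secrets.randbelow(P - 2) + 1
    return pow(G, sk, P), sk

def encaps(pk: int) -> tuple[int, bytes]:
    """Sender derives a fresh shared secret and a ciphertext encapsulating it."""
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)
    ss = sha256(str(pow(pk, r, P)).encode()).digest()
    return ct, ss

def decaps(sk: int, ct: int) -> bytes:
    """Receiver recovers the same shared secret from the ciphertext."""
    return sha256(str(pow(ct, sk, P)).encode()).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)
assert decaps(sk, ct) == ss_sender  # both sides now hold the same symmetric key
```

In TLS-style key establishment, that shared secret then keys a symmetric cipher; ML-KEM provides exactly this interface, just built on lattice problems believed hard even for a QC.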
NIST has also published a timeline (see: NIST IR 8547) for transitioning to the new PQC algorithms, detailing how long each current/legacy algorithm remains usable. The transition is expected to be complete 🏁 in 2035.
My key takeaway was also called out by the NSA presenter: standards are ready, the clock is ticking, time to roll up your sleeves and work on how you get to PQC.