The Internet Engineering Task Force (IETF) is the standards body responsible for the TLS encryption standard — which your browser is using right now to allow you to read LWN.net. As part of its work to keep TLS secure, the IETF has been entertaining proposals to adopt "post-quantum" cryptography (that is, cryptography that is not known to be easily broken by a quantum computer) for TLS version 1.3. Discussion of the proposal has exposed a large disagreement between participants who worried about weakened security and others who worried about weakened marketability.
What is post-quantum cryptography?
In 1994, Peter Shor developed Shor’s algorithm, which can use a quantum computer to factor large numbers asymptotically faster (i.e. faster by a proportion that grows as the size of the input does) than a classical computer can. This was a huge blow to the theoretical security of the then-common RSA public-key encryption algorithm, whose security depends on factoring large numbers being hard. Later work extended Shor’s algorithm to other key-exchange algorithms, such as elliptic-curve Diffie-Hellman, the most common key-exchange algorithm on the modern internet. There are doubts that any attack using a quantum computer could actually be made practical, but given that the field of cryptography moves slowly, it could still be worth getting ahead of the curve.
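The classical skeleton of Shor’s algorithm can be sketched in ordinary Python: factoring is reduced to finding the multiplicative order of a random base, and only that order-finding step (done here by slow brute force) is what a quantum computer accelerates. The function names below are illustrative, not taken from any library:

```python
import math
import random

def order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r = 1 (mod n), found by brute force.
    This is the only step that a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n: int) -> int:
    """Find a nontrivial factor of an odd composite n via order-finding."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g  # lucky guess: a already shares a factor with n
        r = order(a, n)
        if r % 2 == 0:
            # If r is even, gcd(a^(r/2) +/- 1, n) often yields a factor.
            y = pow(a, r // 2, n)
            for candidate in (math.gcd(y - 1, n), math.gcd(y + 1, n)):
                if 1 < candidate < n:
                    return candidate

print(shor_factor(15))  # prints a nontrivial factor of 15 (3 or 5)
```

On a classical machine the `order()` loop takes exponential time in the size of `n`; Shor’s contribution was showing that a quantum computer can find the period in polynomial time, while the surrounding number theory stays classical.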
Quantum computing is sometimes explained as trying all possible answers to a problem at once, but that is incorrect. If that were the case, quantum computers could trivially break any possible encryption algorithm. Instead, quantum computers work by applying a limited set of transformations to a quantum state that can be thought of as a high-dimensional unit-length vector. The beauty of Shor’s algorithm is that it shows how to use these extremely limited operations to reliably factor numbers.
The study of post-quantum cryptography is about finding an encryption mechanism that none of the generalizations of Shor’s algorithm or related quantum algorithms apply to: finding encryption techniques where there is no known way for a quantum computer to break them meaningfully faster than a classical computer can. While attackers may not be breaking encryption with quantum computers today, the worry is that they could record encrypted traffic now ("store now, decrypt later") and break today’s cryptography with the theoretically much more capable quantum computers of tomorrow.
For TLS, the question is specifically how to make a post-quantum key-exchange mechanism. When a TLS connection is established, the server and client use public-key cryptography to agree on a shared encryption key without leaking that key to any eavesdroppers. Then they can use that shared key with (much less computationally expensive) symmetric encryption to secure the rest of the connection. Current symmetric encryption schemes are almost certainly not vulnerable to attack by quantum computers because of their radically different design, so the only part of TLS’s security that needs to be upgraded to resist a quantum computer is the key-exchange mechanism.
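The pattern can be illustrated with a toy finite-field Diffie-Hellman exchange in Python. The parameters here are illustratively undersized and insecure, and real TLS deployments use groups such as X25519 instead, but the shape of the exchange is the same: each side publishes one value, and both derive the same symmetric key without it ever crossing the wire:

```python
import hashlib
import secrets

# Toy Diffie-Hellman group: a Mersenne prime and a small generator.
# Far too weak for real use; real TLS uses X25519 or similar.
P = 2**127 - 1
G = 3

# Each side picks a private exponent and publishes G^x mod P.
alice_secret = secrets.randbelow(P - 2) + 1
bob_secret = secrets.randbelow(P - 2) + 1
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Each side combines its own secret with the other's public value;
# both arrive at the same shared secret, which was never transmitted.
alice_shared = pow(bob_public, alice_secret, P)
bob_shared = pow(alice_public, bob_secret, P)
assert alice_shared == bob_shared

# Derive a symmetric key for the (cheaper) bulk encryption that follows.
key = hashlib.sha256(alice_shared.to_bytes(16, "big")).digest()
```

An eavesdropper sees only `alice_public` and `bob_public`; recovering the shared secret from those requires solving the discrete-logarithm problem, which is exactly what Shor’s algorithm makes tractable on a quantum computer.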
Belt and suspenders
The problem, of course, is that trying to come up with novel, hard mathematical problems that can be used as the basis of an encryption scheme does not always work. Sometimes, cryptographers will pose a problem believing it to be sufficiently hard, and then a mathematician will come along and discover a new approach that makes attacking the problem feasible. That is exactly what happened to the SIKE protocol in 2022. Even when a cryptosystem is not completely broken, a particular implementation can still suffer from side-channel attacks or other problematic behaviors, as happened with the post-quantum encryption standard Kyber/ML-KEM multiple times from its initial draft in 2017 to the present.
That’s why, when the US National Institute of Standards and Technology (NIST) standardized Kyber/ML-KEM as its recommended post-quantum key-exchange mechanism in August 2024, it provided approved ways to combine a traditional key-exchange mechanism with a post-quantum key-exchange mechanism. When these algorithms are properly combined (which is not too difficult, although cryptographic implementations always require some care), the result is a hybrid scheme that remains secure so long as either one of its components remains secure.
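The combination principle can be sketched in a few lines: run both exchanges independently, then feed both shared secrets into a single key derivation, so that an attacker must break both to recover the key. This is only a schematic sketch; the actual TLS hybrid construction concatenates the two secrets and runs them through the TLS 1.3 key schedule rather than a bare HMAC, and the names here are illustrative:

```python
import hashlib
import hmac

def hybrid_key(classical_secret: bytes, pq_secret: bytes,
               context: bytes) -> bytes:
    """Combine two independently established shared secrets.

    An attacker must recover *both* inputs to learn the output, so the
    derived key stays safe as long as either exchange remains unbroken.
    HKDF-extract style: a keyed hash over the concatenated secrets."""
    return hmac.new(context, classical_secret + pq_secret,
                    hashlib.sha256).digest()

# Hypothetical outputs of an ECDH exchange and an ML-KEM encapsulation.
k = hybrid_key(b"\x01" * 32, b"\x02" * 32, b"hybrid-demo-context")
```

Because the secrets are concatenated before hashing, guessing one of them does not help: without the other, the attacker is left inverting a keyed hash over unknown input, which neither classical nor quantum computers are known to do efficiently.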
The Linux Foundation’s Open Quantum Safe project, which provides open-source implementations of post-quantum cryptography, fully supports this kind of hybrid scheme. The IETF’s initial draft recommendation in 2023 for how to use post-quantum cryptography in TLS specifically said that TLS should use this kind of hybrid approach:
The migration to [post-quantum cryptography] is unique in the history of modern digital cryptography in that neither the traditional algorithms nor the post-quantum algorithms are fully trusted to protect data for the required data lifetimes. The traditional algorithms, such as RSA and elliptic curve, will fall to quantum cryptanalysis, while the post-quantum algorithms face uncertainty about the underlying mathematics, compliance issues (when certified implementations will be commercially available), unknown vulnerabilities, hardware and software implementations that have not had sufficient maturing time to rule out classical cryptanalytic attacks and implementation bugs.
During the transition from traditional to post-quantum algorithms, there is a desire or a requirement for protocols that use both algorithm types. The primary goal of a hybrid key exchange mechanism is to facilitate the establishment of a shared secret which remains secure as long as one of the component key exchange mechanisms remains unbroken.
But the most recent draft from September 2025, which was ultimately adopted as a working-group document, relaxes that requirement, noting:
However, Pure PQC Key Exchange may be required for specific deployments with regulatory or compliance mandates that necessitate the exclusive use of post-quantum cryptography. Examples include sectors governed by stringent cryptographic standards.
This refers to the US National Security Agency (NSA) requirements for products purchased by the US government. The requirements "will effectively deprecate the use of RSA, Diffie-Hellman (DH), and elliptic curve cryptography (ECDH and ECDSA) when mandated." The NSA has a history of publicly endorsing weak (plausibly already broken, internally) cryptography in order to make its job — monitoring internet communications — easier. If the draft were to become an internet standard, the fact that it optionally permits the use of non-hybrid post-quantum cryptography might make some people feel that such cryptography is safe, when that is not the current academic consensus.
There are other arguments for allowing non-hybrid post-quantum encryption — mostly boiling down to the implementation and performance costs of supporting a more complex scheme. But when Firefox, Chrome, and the Open Quantum Safe project all already support and use hybrid post-quantum encryption, that motivation didn’t ring true for other IETF participants.
Some proponents of the change argued that a non-hybrid post-quantum scheme would be simpler to implement than a hybrid one. Opponents said that was focusing on the wrong kind of simplicity; adding yet another key-exchange option to TLS makes implementations more complex, not less. They also pointed out that the cost of modern elliptic-curve cryptography is so much smaller than the cost of post-quantum cryptography that using both would not have a major impact on the performance of TLS.
From substance to process
The disagreement came to a head when Sean Turner, one of the chairs of the IETF working group discussing the topic, declared in March 2025 that consensus had been reached and the proposal ought to move to the next phase of standardization: adoption as a working-group document. Once a draft document is adopted, it enters a phase of editing by the members of the working group to ensure that it is clearly written and technically accurate, before being sent to the Internet Engineering Steering Group (IESG) to possibly become an internet standard.
Turner’s decision to adopt the draft came as a surprise to some of the participants in the discussion, such as Daniel J. Bernstein, who strongly disagreed with weakening the requirements for TLS 1.3 to allow non-hybrid key-exchange mechanisms and had repeatedly said as much. The IETF operates on a consensus model where, in theory, objections raised on the mailing list need to be responded to and either refuted or used to improve the standard under discussion.
In practice, the other 23 participants in the discussion acknowledged the concerns of the six people who objected to the inclusion of non-hybrid post-quantum key-exchange mechanisms in the standard. The group that wanted to see the draft accepted just disagreed that it was an important weakening in the face of regulatory and maintenance concerns, and wanted to adopt the standard as written anyway.
From there, the discussion turned on the question of whether the working-group charter allowed for adopting a draft that reduced the security of TLS in this context. That question never reached a consensus either. After repeated appeals from Bernstein over the next several months, the IESG, which handles the IETF’s internal policies and procedures, asked Paul Wouters and Deb Cooley, the IETF’s area directors responsible for the TLS working group, whether Turner’s declaration of consensus had been made correctly.
Wouters declared that Turner had made the right call, based on the state of the discussion at the time. He pointed out that while the draft permits TLS to use non-hybrid post-quantum key-exchange algorithms, it doesn’t recommend them: the recommendation remains to use the hybrid versions where possible. He also noted that the many voices calling for adoption indicated that there was a market segment being served by the ability to use non-hybrid algorithms.
A few days after Wouters’s response, on November 5, Turner called for last objections to adopting the draft as a working-group document. Employees of the NSA, the United Kingdom’s Government Communications Headquarters (GCHQ), and the Communications Security Establishment Canada (CSEC) all wrote in with their support, as did employees of several companies working on US military contracts. Quynh Dang, an employee of NIST, also supported publication as a working-group document, though Dang claimed not to represent NIST in this matter. Among others, Stephen Farrell disagreed, calling for the standard to at least add language noting that security experts in the working group considered the hybrid approach more secure: "Absent that, I think producing an RFC based on this draft provides a misleading signal to the community."
As it stands now, the working group has adopted the draft that allows for non-hybrid post-quantum key-exchange mechanisms to be used in TLS. According to the IETF process, the draft will now be edited by the working-group members for clarity and technical accuracy, before being presented to the IESG for approval as an internet standard. At that point, companies wishing to sell their devices and applications to the US government will certainly enable the use of these less-secure mechanisms — and be able to truthfully advertise their products as meeting NIST, NSA, and IETF standards for security.
[ Thanks to Thomas Dalichow for bringing this topic to our attention. ]