Building a Quantum-Safe Internet: The IETF's Plan for TLS

Written by Rich Salz

June 18, 2025


Rich Salz is a Principal Architect in Architecture & Technology Strategy at Akamai. He has been involved in the definition and implementation of internet and security standards for more than 30 years, actively with the IETF and the QuicTLS toolkit. At Akamai, he works on making systems and customers more secure by default.


Thanks to Akamai Chief Architect Jan Schaumann and Akamai Senior Product Manager Tim Daffron for their contributions to this post.


Transport Layer Security (TLS) is the protocol responsible for securing most of the sensitive data transmitted over the internet — from online banking to private messages. The invention of a powerful quantum computer would put that sensitive information at risk.

Organizations like the U.S. National Institute of Standards and Technology (NIST) and the Internet Engineering Task Force (IETF) are working to modernize encryption methods and bolster data protection in a post-quantum world. 

In this blog post, we’ll explore these organizations’ recent initiatives and give an update on Akamai’s three-phased approach to support end-to-end post-quantum cryptography on our content delivery platform. 

A glimpse into the future: Post-quantum cryptography

Today, TLS relies heavily on cryptosystems such as RSA, which are based on complex mathematical problems that are difficult for classical computers to solve. However, when a large-scale quantum computer is built, it could use powerful quantum algorithms like Shor’s algorithm to solve these problems quickly — undermining the security of TLS and posing a major cybersecurity threat.

Post-quantum cryptography (PQC), or quantum-resistant cryptography, is a field of cryptography that seeks to mitigate quantum threats by developing cryptographic systems that would remain secure even if a powerful quantum computer were unleashed on modern encryption. NIST is at the forefront of this field.

NIST’s post-quantum cryptography standards

NIST plays an important role in setting standards across a wide range of technology-oriented industries — from healthcare to transportation to information technology. In the computing world, NIST is widely known for its leadership in defining important cryptographic standards.

Recently, the organization completed a multi-year global effort to standardize new algorithms that are designed to stay secure even after the arrival of cryptographically relevant quantum computers (CRQCs).

The U.S. Department of Defense has stated that it plans to follow the NIST timeline for migrating to post-quantum cryptography standards no later than 2035. As of this writing, many other countries, including Canada, Sweden, Germany, and the United Kingdom, have adopted a similar timetable. But government standards are not necessarily enough to move the industry forward.

Although the standardization of new post-quantum encryption algorithms is a milestone, it’s just the beginning. These cryptographic algorithms must still be integrated into real systems across public and private sectors before we can consider the global internet to be quantum-safe.

Building a quantum-safe internet

The responsibility for defining how these new standards should be adopted and used in protocols like TLS falls to the IETF. Let’s take a closer look at the organization’s recent work in this area.

Understanding the key exchange

Most public key cryptography follows the same two-part pattern. First, you use a complex (and computationally expensive) asymmetric algorithm to set up a shared, symmetric key between the two parties. This step is called the key exchange, or key establishment. Next, you use that key to encrypt and decrypt the bulk of the data symmetrically. 

This two-part mechanism works well for most use cases. For example, if you encrypt a file or email, you can later add more recipients simply by giving them secure access to the bulk encryption key, without having to re-encrypt the entire message.
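
To make the pattern concrete, here is a minimal sketch in Python using the pyca/cryptography package. X25519 and AES-256-GCM stand in for whatever algorithms a real protocol such as TLS 1.3 would actually negotiate; the variable names and labels are illustrative only.

```python
# A minimal sketch of the two-part pattern using the pyca/cryptography package.
# X25519 and AES-256-GCM stand in for whatever algorithms a real protocol
# (such as TLS 1.3) would negotiate; names and labels here are illustrative.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Part 1: key establishment. Each side generates an ephemeral key pair,
# exchanges public keys, and computes the same shared secret.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
shared_secret = alice_priv.exchange(bob_priv.public_key())
assert shared_secret == bob_priv.exchange(alice_priv.public_key())

# Derive a fixed-length symmetric (bulk) key from the raw shared secret.
bulk_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"example bulk key"
).derive(shared_secret)

# Part 2: bulk encryption. The symmetric key protects the actual data.
nonce = os.urandom(12)
ciphertext = AESGCM(bulk_key).encrypt(nonce, b"the actual payload", None)
assert AESGCM(bulk_key).decrypt(nonce, ciphertext, None) == b"the actual payload"
```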

Securing the key exchange with hybrid algorithms

In our August 2024 blog post, Taking Steps to Prepare for Quantum Advantage, we discussed the immediate need to secure the TLS key exchange and protect against "harvest now, decrypt later" attacks, in which attackers collect encrypted data today in hopes of decrypting it later with the power of a quantum computer. Days after that post went live, NIST published its initial PQC standards to defend against this threat.

Since those standards were published, the IETF has created several working groups that are defining how to use post-quantum algorithms within the existing standards framework. So far, the IETF has mostly prioritized hybrid algorithms, which combine a classical algorithm with a post-quantum one. This belt-and-suspenders approach is favored by many experts because post-quantum algorithms are still new and haven't received the same depth of analysis as classical algorithms.

One of the most important uses of these algorithms is in the TLS key exchange. Although an early draft of the guidance showed how to use NIST’s proposed Kyber standard, a subsequent draft incorporated the official version: Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM), published as FIPS 203.

The new name describes the mathematical principles of the algorithm, which uses techniques that are believed to be safe against all known quantum computing algorithms. Shortly after the updated draft was released on August 8, 2024, internet browsers moved from implementing the draft Kyber-X25519 hybrid to preferring the X25519MLKEM768 hybrid.
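
To illustrate the hybrid idea, here is a rough sketch of how a combined secret can be formed. The ML-KEM-768 shared secret is represented by a placeholder (random bytes) because Python bindings for ML-KEM vary; in the actual X25519MLKEM768 key share, the two secrets are concatenated and fed into the TLS 1.3 key schedule, so the final HKDF step below is only there to round out the example.

```python
# Rough sketch of the hybrid idea behind X25519MLKEM768. The ML-KEM-768 part
# is a placeholder (random bytes standing in for the KEM's shared secret)
# because Python ML-KEM bindings vary; a real implementation would use the
# KEM's encapsulate/decapsulate operations instead.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Classical component: an ordinary X25519 exchange.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
x25519_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum component: stand-in for the 32-byte ML-KEM-768 shared secret.
mlkem_secret = os.urandom(32)

# The hybrid secret is the concatenation of the two (the IETF draft puts the
# ML-KEM secret first for X25519MLKEM768). In TLS 1.3 this value feeds the
# normal key schedule; the HKDF call below just rounds out the sketch.
hybrid_secret = mlkem_secret + x25519_secret
session_key = HKDF(
    algorithm=hashes.SHA256(), length=32, salt=None, info=b"hybrid example"
).derive(hybrid_secret)

# An attacker would have to break BOTH components to recover session_key.
```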

Defining TLS 1.3 as the starting point for post-quantum cryptography

In addition to defining many of the standards that govern how the internet operates today, the IETF has sent a clear signal that any new advances in cryptography, such as post-quantum support, will use TLS 1.3 as the starting point. TLS 1.3 is the most up-to-date and secure version of the internet's primary protocol for encrypting data.

The IETF will be publishing two documents that will soon be RFCs. (Disclosure: I am the primary author of both.)

  1. The first document (from the applications area) specifies that any new protocol that uses TLS should require TLS 1.3 (a minimal configuration sketch follows this list). The internet is not just the World Wide Web: even though many Akamai customers have custom devices, such as web TVs or point-of-sale card readers, that cannot be upgraded easily, we still see that more than 80% of the traffic Akamai delivers uses TLS 1.3.
  2. The second document (from the TLS working group) states that, barring exceptional security issues, TLS 1.2 is frozen. Most notably, post-quantum cryptography is only being added to TLS 1.3 through the existing TLS extension mechanism.
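
For application developers, requiring TLS 1.3 is often a small configuration change. Here is a minimal sketch using Python's standard ssl module; the hostname is a placeholder.

```python
# Minimal sketch: require TLS 1.3 for a client connection using only the
# Python standard library. "example.com" is a placeholder hostname.
import socket
import ssl

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3  # refuse anything older

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())  # expect "TLSv1.3"
```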

Enhancing security today and preparing for a post-quantum future

Akamai is proud of our contributions to enhancing internet security. We funded the development of TLS 1.3 in OpenSSL, a major upgrade that makes encrypted connections faster and more secure. We were also a founding sponsor of Let’s Encrypt, a nonprofit organization that provides free TLS certificates to websites. 

By making it easier and more affordable for websites to enable HTTPS encryption, the organization has significantly increased the use of TLS. These efforts, among others, help strengthen the overall security of the internet and lay the foundation for post-quantum security adoption. 

Akamai’s phased approach: An update

In our previous post, we outlined Akamai’s three-phased approach to securing end-to-end delivery transport, including: 

  • Phase one: Akamai-to-origin (Akamai-to-website)
  • Phase two: Client-to-Akamai (browser-to-Akamai)
  • Phase three: Akamai-to-Akamai 

The latest tactical updates on our progress include:

Phase one: Akamai-to-origin

We’re excited to announce a limited availability feature launch of this Akamai-to-origin service on June 30, 2025. We are code complete and finalizing our remaining tasks. Akamai delivery customers will be able to opt in through their account teams, exposing TLS 1.3 plus PQC (an ML-KEM–based hybrid) as a part of the handshake.

Customer origins will select the highest handshake match, just as our platform does today. The feature will be available to all Akamai Ion and delivery customers (on enhanced TLS) at no additional cost.
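
As a loose illustration of "highest match" selection, the sketch below picks the first entry in a client's ordered preference list that the server also supports. The group names and the selection policy are illustrative only; real TLS stacks apply their own preference rules.

```python
# Loose illustration of "highest match" selection: pick the first entry in the
# client's ordered preference list that the server also supports. Group names
# and the selection policy are illustrative; real TLS stacks apply their own
# preference rules.
client_groups = ["X25519MLKEM768", "x25519", "secp256r1"]  # client's order
server_groups = {"X25519MLKEM768", "x25519", "secp256r1"}  # what the origin supports

negotiated = next((g for g in client_groups if g in server_groups), None)
print(negotiated)  # -> X25519MLKEM768 when both sides support the hybrid
```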

Phase two: Client-to-Akamai

We’re equally excited to announce the limited availability feature launch of this client browser-to-Akamai service on September 1, 2025. Similar to our Akamai-to-origin approach, customers will be able to opt in and have the additional TLS plus PQC handshake option presented to client browsers. 

The feature will be available to all Akamai Ion and Akamai Dynamic Site Accelerator customers (on enhanced TLS) at no additional cost.

Phase three: Akamai-to-Akamai

We’re planning to launch this Akamai-to-Akamai feature in late 2025. Today, all Akamai mid-tier traffic is compliant with FIPS 140 through our secure edge architecture. We’re currently developing a solution that will secure all mid-tier traffic by making it both FIPS 140 compliant and quantum resistant. 

Our final design for this feature is complete, and end-to-end planning activities are underway. We will provide a timeline update with our commitment date in a future post.

Stay tuned

Akamai is committed to safeguarding customer data and staying ahead of quantum computing threats. We’ll continue to share regular updates on our PQC implementation process via more blog posts in this series. 

Our next post will cover recommendations for early adopters of post-quantum cryptography, including which software stacks support these new algorithms and what pitfalls might be expected. Stay tuned!

For more information

For additional information or questions, please contact your account representative.


