Explore why PQC adoption lags in Europe, the real blockers, and how to achieve quantum-safe security.
Quantum computing has long been the subject of both hype and genuine concern in security circles. With the recent release of NIST’s first post-quantum cryptography (PQC) standards, the conversation has shifted from “if” to “when”—and, more urgently, “how.” Yet, for all the talk of “Q Day” and the need to be quantum-ready, the reality is that very few organizations, especially in Europe, are making concrete moves toward a quantum-safe future.
So, what’s really holding us back? And what does meaningful preparation actually look like for organizations entrusted with digital trust at scale?
As a PKI and CLM provider working closely with European enterprises and critical infrastructure, Evertrust sees a stark gap between the existence of standards and real-world adoption. Here’s our view on what’s holding back the transition, and what needs to happen next.
NIST’s release of FIPS 203, 204, and 205 (ML-KEM, ML-DSA, and SLH-DSA, derived from Kyber, Dilithium, and SPHINCS+) was a watershed moment. For the first time, we have a clear technical direction for quantum-resistant key establishment and digital signatures.
But as anyone who’s ever managed a PKI knows, standards are only the beginning.
The most critical blocker to PQC migration remains the lack of PQC-ready Hardware Security Modules (HSMs) and Key Management Services (KMSs).
Indeed, HSMs and KMSs are the trust anchors for PKI, code signing, digital identity, and increasingly, cloud-native applications. They perform secure key generation, storage, signing, and policy enforcement.
Yet as of mid-2025, no major HSM vendor offers general-availability support for the NIST PQC algorithms (ML-KEM, ML-DSA, SLH-DSA). The same goes for the major cloud KMS providers (AWS, Azure, Google Cloud), which have only just started to roll out PQC primitives in preview or limited beta, mostly for experimental use.
On-premises KMS appliances are in a similar position, with PQC support still in early development or pilot phases.
What's causing these infrastructure delays? The technical barriers vary by platform, but the result is the same: whether organizations rely on HSMs, KMSs, or a hybrid approach, the lack of mature, certified PQC support in these foundational components is a hard blocker for any meaningful migration.
Cryptography goes beyond digital certificates: every protocol and application that uses asymmetric keys is in scope. It’s not enough for a single product or cloud service to “support PQC.” True post-quantum readiness requires a holistic transformation of the entire cryptographic ecosystem.
This means updating and coordinating quantum-safe algorithms across all layers—cryptographic protocols (TLS, SSH, VPNs), applications, hardware security modules, identity and access management systems, and key lifecycle processes.
Each component must interoperate seamlessly to maintain security and functionality. Achieving this level of integration demands extensive planning, rigorous testing, and collaboration across vendors and standards bodies. Without this comprehensive approach, isolated PQC support risks creating security gaps and operational disruptions.
For example, TLS 1.3, SSH, S/MIME, IPsec, and even proprietary protocols must be updated to support PQC. Hybrid key exchange (classical + PQC) is also needed for backward compatibility, but not all libraries or stacks support it. And finally, PQC handshake messages are larger, which can break assumptions in legacy code or network appliances.
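To make that concrete, here is a minimal sketch of what hybrid key exchange looks like at the configuration level, assuming a Go 1.24+ toolchain where crypto/tls exposes the X25519MLKEM768 hybrid group; the certificate and key file names are placeholders for your own server credentials.

```go
package main

import (
	"crypto/tls"
	"log"
	"net/http"
)

func main() {
	// Prefer the hybrid ML-KEM-768 + X25519 group, but keep classical X25519
	// so that clients without PQC support can still connect.
	cfg := &tls.Config{
		MinVersion: tls.VersionTLS13,
		CurvePreferences: []tls.CurveID{
			tls.X25519MLKEM768, // hybrid key exchange (Go 1.24+)
			tls.X25519,         // classical fallback for older clients
		},
	}

	srv := &http.Server{
		Addr:      ":8443",
		TLSConfig: cfg,
		Handler: http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
			w.Write([]byte("hello from a hybrid-capable endpoint\n"))
		}),
	}
	// cert.pem / key.pem are placeholders for your own server credentials.
	log.Fatal(srv.ListenAndServeTLS("cert.pem", "key.pem"))
}
```

Because TLS 1.3 negotiates groups per connection, clients that do not yet offer the hybrid group simply fall back to classical X25519, which is exactly the backward compatibility the hybrid approach is meant to provide.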
On the application side, many enterprise applications have hardcoded assumptions about key sizes, certificate formats, and cryptographic providers. Updating these requires vendor support, regression testing, and sometimes architectural changes.
Interoperability is another concern: post-quantum algorithms are new and not uniformly supported across cryptographic stacks (OpenSSL, BouncyCastle, and others).
Certificate chains with hybrid or PQC signatures may not validate correctly in all clients.
The bottom line: even if you can issue PQC certificates today, most of your infrastructure cannot yet consume or validate them end to end.
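A quick way to see the validation problem is to ask your own crypto stack whether it even recognizes a certificate’s signature algorithm. The sketch below, assuming Go’s standard library and a PEM file path passed on the command line, reports when the algorithm OID is unknown to the runtime, which is the situation an older client faces when handed a PQC or hybrid chain.

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"log"
	"os"
)

func main() {
	if len(os.Args) < 2 {
		log.Fatal("usage: checksig <cert.pem>")
	}
	pemBytes, err := os.ReadFile(os.Args[1])
	if err != nil {
		log.Fatal(err)
	}
	block, _ := pem.Decode(pemBytes)
	if block == nil || block.Type != "CERTIFICATE" {
		log.Fatal("no CERTIFICATE block found")
	}

	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		// Stacks that predate the new OIDs may refuse to parse at all.
		log.Fatalf("parse failed (possibly an unrecognized PQC structure): %v", err)
	}

	if cert.SignatureAlgorithm == x509.UnknownSignatureAlgorithm {
		// The certificate parsed, but this runtime cannot verify the
		// signature: chain validation against it will fail here.
		fmt.Println("signature algorithm not recognized by this crypto stack")
		return
	}
	fmt.Println("signature algorithm:", cert.SignatureAlgorithm)
}
```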
In Europe, especially, many organizations are waiting for clear migration roadmaps and compliance requirements from regulators.
The upcoming eIDAS 2.0 and ETSI standards will help, but for now, many CISOs are in “wait and see” mode.
The European Commission took a major step in April 2024 by publishing a Recommendation for a coordinated roadmap on PQC. This initiative encourages EU Member States to harmonize their approaches and even proposes the creation of a dedicated sub-group within the NIS Cooperation Group to oversee the transition.
However, unlike the U.S. National Security Agency’s CNSA 2.0 policy, which sets phased, binding deadlines for retiring vulnerable algorithms, Europe’s roadmap does not impose concrete migration dates. The result is a landscape where the direction is clear, but the pace and urgency remain uncertain.
Sector-specific regulations are also evolving. The Digital Operational Resilience Act (DORA), which is now in force, sets out requirements for cryptography management in the financial sector. Yet, DORA stops short of mandating when or how PQC must be adopted.
Meanwhile, Europol and the European Commission have issued urgent calls for coordinated action, emphasizing the need for cross-sectoral planning and pilot projects. These signals are important, but they do not yet translate into enforceable obligations for enterprises.
Funding and support for PQC research and testing are ramping up. The European Cybersecurity Competence Centre (ECCC) has allocated €25 million to build a European testing infrastructure for PQC, with more funding earmarked for cyber resilience projects through 2026. These investments are designed to help bridge the gap between research and market deployment, but they are still in the early stages. Concrete results, such as certified and widely available PQC solutions, will take time to materialize.
Standardization is another area where progress is visible but incomplete. The European Commission is working with European Standardisation Organisations to integrate NIST-approved PQC algorithms into EU cryptographic protocols. Projects like PQC4eMRTD, focused on quantum-safe electronic passports, are underway, but broader harmonization and finalization of standards for all critical sectors is still a work in progress.
Industry and policy groups are also mobilizing. For example, CEPS and other think tanks have launched task forces to accelerate the EU’s transition, raising important questions about whether Europe is falling behind and calling for explicit migration deadlines and stronger mandates. Despite these efforts, most enterprise adoption is still driven by voluntary best practices rather than hard compliance requirements.
One of the most pressing challenges is the sheer complexity of key and certificate lifecycle management in a post-quantum world.
PQC key pairs are typically much larger than their classical counterparts, which impacts storage, backup, and performance. The recommended lifetimes for these keys may also be shorter, given that the algorithms themselves are newer and may evolve as cryptanalysis advances. This means organizations will need to rotate and renew keys more frequently, increasing the operational burden on already stretched security teams.
Traditional processes for key escrow, backup, and recovery are often not designed for the unique requirements of PQC keys. For example, larger key sizes can strain existing hardware and software, and legacy systems may not be able to handle new formats or workflows.
Automated certificate discovery and renewal tools, which are essential for maintaining visibility and compliance in large environments, must be updated to recognize and support both PQC and hybrid certificates. Without these updates, organizations risk losing track of critical assets, missing renewal deadlines, or inadvertently deploying weak cryptography.
Monitoring and compliance add another layer of complexity. Security monitoring tools and compliance frameworks are still catching up to PQC. Many are not yet able to accurately identify, audit, or report on the use of new algorithms or hybrid deployments. This lack of visibility increases the risk of misconfiguration, non-compliance, or undetected vulnerabilities during the transition period.
All of these factors contribute to a significant increase in operational risk if the migration is not tightly managed and monitored. The reality is that most organizations today lack the integrated tools and mature processes needed to orchestrate a smooth, secure, and auditable transition to PQC at scale.
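As one small example of what “tightly managed” means in practice, a CLM process needs to reason about shorter lifetimes explicitly. The sketch below assumes an already-collected certificate inventory and an illustrative one-year maximum lifetime; the actual policy figure belongs to your governance, not to this example.

```go
package inventory

import (
	"crypto/x509"
	"time"
)

// maxLifetime is a placeholder policy: rotate PQC-era certificates at least
// yearly. The real figure should come from your CA/CLM governance.
const maxLifetime = 365 * 24 * time.Hour

// RotationDue returns the date by which a certificate should be re-issued:
// either the policy limit counted from issuance or its own expiry,
// whichever comes first.
func RotationDue(cert *x509.Certificate) time.Time {
	policyDeadline := cert.NotBefore.Add(maxLifetime)
	if cert.NotAfter.Before(policyDeadline) {
		return cert.NotAfter
	}
	return policyDeadline
}

// NeedsEarlyReissue flags certificates whose validity already exceeds the
// policy, i.e. candidates for proactive replacement rather than waiting for
// natural expiry.
func NeedsEarlyReissue(cert *x509.Certificate, now time.Time) bool {
	return now.After(RotationDue(cert)) || cert.NotAfter.Sub(cert.NotBefore) > maxLifetime
}
```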
Recent research from Google Quantum AI has dramatically lowered the estimated resources needed to break 2048-bit RSA encryption with a quantum computer. As of May 2025, it’s now believed that a quantum computer with just 1 million noisy qubits running for one week could break RSA-2048—a 20-fold reduction in the qubit count from estimates just six years ago. This rapid progress is thanks to both algorithmic breakthroughs and advances in error correction.
While today’s quantum computers are still far from this scale, the pace of improvement is clear—and so is the risk. As Google’s researchers point out, “store now, decrypt later” attacks are already a reality: adversaries can collect encrypted data today and decrypt it once quantum computers are ready. This makes the case for urgent migration to PQC even stronger, especially for long-lived keys and sensitive data.
NIST’s own guidance now recommends deprecating vulnerable systems after 2030 and disallowing them entirely after 2035. The window for safe transition is closing fast.
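The planning arithmetic behind “store now, decrypt later” is simple enough to write down. The helper below is a sketch that uses the 2030 and 2035 dates from the NIST guidance above and asks two questions: does the data you encrypt today need to stay confidential past the deprecation point, and would a migration started now even finish before classical algorithms are disallowed?

```go
package planning

import "time"

// Reference dates from the NIST guidance cited above: classical algorithms
// deprecated after 2030 and disallowed after 2035.
var (
	deprecated = time.Date(2030, 1, 1, 0, 0, 0, 0, time.UTC)
	disallowed = time.Date(2035, 1, 1, 0, 0, 0, 0, time.UTC)
)

// HarvestNowExposed answers the "store now, decrypt later" question: data
// encrypted today under classical algorithms must stay confidential for
// confidentialityYears; if that horizon reaches past the deprecation date,
// it is at risk even though no cryptographically relevant quantum computer
// exists yet.
func HarvestNowExposed(now time.Time, confidentialityYears int) bool {
	return now.AddDate(confidentialityYears, 0, 0).After(deprecated)
}

// MigrationDeadlineMissed checks whether a migration taking migrationYears,
// started today, would finish only after classical algorithms are disallowed.
func MigrationDeadlineMissed(now time.Time, migrationYears int) bool {
	return now.AddDate(migrationYears, 0, 0).After(disallowed)
}
```

For example, data encrypted in 2025 that must remain confidential for ten years already sits inside the risk window.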
Preparing for the post-quantum cryptography era is not about rushing into a massive, disruptive overhaul. Instead, it is a strategic, phased journey: one that requires building crypto-agility and laying a strong foundation for a smooth transition, aligned with your organization’s unique risk profile and operational realities.
Here’s how we see the path forward:
1) Inventory and visibility: know what you have before you move
Too often, organizations underestimate the complexity of their cryptographic footprint. Real preparation starts with comprehensive visibility. This means mapping every certificate, key, and cryptographic usage point across your entire environment—whether it’s PKI, TLS endpoints, code signing, VPNs, SSH, or embedded devices.
Without this granular inventory, you’re flying blind. You can’t protect what you don’t know exists. Understanding the types of algorithms in use, key lengths, expiration timelines, and dependencies is critical to prioritizing your migration efforts and avoiding operational surprises.
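A discovery pass does not need to start sophisticated. The sketch below, assuming Go’s standard library and a placeholder endpoint list, records the signature algorithm, public key algorithm, and expiry of whatever certificate each TLS endpoint presents; real discovery would feed this from network scans, CMDB exports, and DNS or load-balancer inventories.

```go
package main

import (
	"crypto/tls"
	"fmt"
	"time"
)

// endpoints is a placeholder list; in practice it would come from network
// discovery, CMDB exports, or load balancer and DNS inventories.
var endpoints = []string{"internal-app.example.com:443", "api.example.com:8443"}

func main() {
	for _, addr := range endpoints {
		conn, err := tls.Dial("tcp", addr, &tls.Config{
			// Inventory only: we want the certificate even if the chain is broken.
			InsecureSkipVerify: true,
		})
		if err != nil {
			fmt.Printf("%s: unreachable (%v)\n", addr, err)
			continue
		}
		state := conn.ConnectionState()
		conn.Close()
		if len(state.PeerCertificates) == 0 {
			fmt.Printf("%s: no certificate presented\n", addr)
			continue
		}
		cert := state.PeerCertificates[0]
		fmt.Printf("%s: sigalg=%s keyalg=%s expires=%s\n",
			addr,
			cert.SignatureAlgorithm,
			cert.PublicKeyAlgorithm,
			cert.NotAfter.Format(time.RFC3339))
	}
}
```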
2) Centralization and automation: scaling security with confidence
Manual, spreadsheet-based certificate management simply doesn’t scale—especially when you consider the increased complexity PQC introduces. Centralizing control through a robust certificate lifecycle management (CLM) platform is essential.
Automation is the key to reducing human error and operational overhead. Automated discovery, renewal, revocation, and bulk replacement of certificates enable organizations to respond swiftly when PQC migration timelines accelerate. This centralized approach also ensures consistent policy enforcement and compliance visibility across hybrid and multi-cloud environments.
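Abstractly, the automation layer has a small surface. The interface below is a deliberately hypothetical sketch of the operations involved, not the API of Evertrust’s products or any other CLM; it only illustrates the shape that discovery, renewal, revocation, and bulk replacement take when they are driven by policy rather than by hand.

```go
package clm

import (
	"context"
	"crypto/x509"
)

// Lifecycle is an abstract sketch of what an automated CLM layer needs to
// expose so that a PQC migration can be driven by policy. It is not the API
// of any particular product.
type Lifecycle interface {
	// Discover enumerates certificates across PKI, TLS endpoints, code
	// signing, VPN, SSH, and device stores.
	Discover(ctx context.Context) ([]*x509.Certificate, error)

	// Renew re-issues a certificate under the profile named by policy,
	// e.g. a classical, hybrid, or PQC profile.
	Renew(ctx context.Context, cert *x509.Certificate, policy string) (*x509.Certificate, error)

	// Revoke withdraws a certificate that no longer meets policy.
	Revoke(ctx context.Context, cert *x509.Certificate, reason string) error

	// BulkReplace swaps every certificate matching the selector to the
	// target profile, which is what a forced migration ultimately requires.
	BulkReplace(ctx context.Context, selector func(*x509.Certificate) bool, policy string) (int, error)
}
```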
3) Start testing PQC: early adoption mitigates future risks
Waiting until PQC is “fully ready” is a risky gamble. Early testing of PQC algorithms and hybrid certificates in controlled environments provides invaluable insights. It helps identify compatibility issues with legacy systems, measures performance impacts, and allows your teams to develop the necessary skills and processes.
Interoperability challenges are inevitable given the novelty of PQC standards and uneven vendor support. Starting pilots now means you can work through these challenges gradually, rather than facing urgent, large-scale disruptions later.
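A pilot can start with a single question: does this endpoint negotiate the hybrid group at all? The client sketch below, again assuming Go 1.24+ and a lab endpoint you control (the hostname is a placeholder), offers only the hybrid group and skips chain verification so that a failure points at the key exchange rather than at certificate issues; that setting belongs in a lab only.

```go
package main

import (
	"crypto/tls"
	"fmt"
)

func main() {
	// testEndpoint is a placeholder for a lab or staging service you control.
	const testEndpoint = "pqc-pilot.internal.example.com:443"

	// By offering ONLY the hybrid group, the handshake succeeds solely if the
	// server can negotiate it; failure tells you which stack still needs work.
	cfg := &tls.Config{
		MinVersion:         tls.VersionTLS13,
		CurvePreferences:   []tls.CurveID{tls.X25519MLKEM768},
		InsecureSkipVerify: true, // lab pilot only: isolate the key-exchange question
	}

	conn, err := tls.Dial("tcp", testEndpoint, cfg)
	if err != nil {
		fmt.Println("hybrid key exchange not negotiated:", err)
		return
	}
	defer conn.Close()
	fmt.Println("hybrid key exchange negotiated with", testEndpoint)
}
```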
In our view, successful PQC preparation is a balance of awareness, automation, and proactive experimentation. Organizations that invest in these areas today will be positioned not just to survive the quantum transition—but to thrive in a future where crypto-agility is a core business capability.
At Evertrust, we’re committed to making that journey as seamless as possible—so when the time comes, you’re ready to move, not scramble.
Curious about what crypto-agility looks like in practice?
Want to see PQC and hybrid certificates in action?
Let’s talk.