Lattice Blog


[Blog] Building a Strong and Flexible Foundation for Post-Quantum Security

Posted 06/27/2025 by Eric Sivertson, Mamta Gupta

Recent advancements in quantum computing have made post-quantum cryptography (PQC) more necessary than ever before. There is an immediate need for developers across industries to strengthen their computing ecosystem against the heightened risk and uncertain capabilities of quantum-enabled attacks. The challenge? There’s not yet a standard, comprehensive model for ensuring post-quantum security.

Traditionally, developers have been able to create standards and best practices in response to shared experiences. But with quantum capabilities evolving so quickly, they can no longer afford that luxury; they must find ways to secure their systems against quantum risks without gambling on the long-term viability of their security infrastructure.

In our latest Security Seminar, security experts from Lattice, PQShield, Quside, and Secure-IC discussed evolving PQC requirements and the importance of taking a hardware-software co-design approach to meeting these security needs.

What’s Driving Evolving PQC Guidance?
As the quantum landscape continues to take shape, it has spurred the development of various standards and guidelines for PQC. The most prominent of these is arguably the Commercial National Security Algorithm Suite 2.0 (CNSA 2.0), a U.S. NSA directive that mandates the use of quantum-resistant algorithms such as Kyber (standardized by NIST as ML-KEM), Dilithium (ML-DSA), LMS, and XMSS.

Although CNSA 2.0 is the first concrete guidepost for PQC standards, it is not an all-encompassing framework. It’s relatively conservative in scope and fails to account for:

  • The application of increasingly popular algorithms like Falcon or Hamming Quasi-Cyclic (HQC) encryption. Leveraging this kind of cryptographic diversity—where certain algorithms are used for more targeted purposes—can help developers avoid creating a single point of failure in their security infrastructure.
  • The increasing regional divergence in PQC standards. While the NSA’s model is well-researched, it’s not going to be a perfect fit for every country and nation-state. The European Union has developed the Cyber Resilience Act (CRA) with PQC implications, China is deep in its own algorithmic research, and various other countries are following tenets set forth by the National Institute of Standards and Technology (NIST). As each of these regions develops its own regulatory requirements and best practices, multinational organizations will need to ensure regional compliance.

Building an Agile Foundation for PQC
Given how many elements of PQC development are still in flux, it’s challenging to take the leap into building an agile PQC infrastructure. Developers know they need to address quantum threats but want to avoid locking in an algorithm that may be outdated or noncompliant within the year.

To achieve immediate security benefits while ensuring future flexibility, developers must build the following features into their PQC infrastructure:

Crypto-Agility
The first key feature of a resilient PQC model is cryptographic agility, or crypto-agility. This is the ability to switch between cryptographic algorithms on the fly, without disrupting or otherwise negatively affecting core operations, and to roll out newer algorithms and protocols through in-field updates.

As the field of cryptographic algorithms continues to grow and adapt to quantum threats, developers who throw the weight of their entire PQC infrastructure behind one specific algorithm risk painting themselves into a corner. Instead, it is critical to build a system that can support various algorithm types, as well as hybrid models that balance current cryptographic capabilities with future-oriented PQC schemes.
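
For illustration, here is a minimal sketch of what a crypto-agile abstraction might look like in software. It assumes a Python environment with the cryptography package; the Ed25519 entry uses a real library API, while the commented-out ML-DSA entry marks where a hypothetical PQC binding would plug in.

```python
# Minimal crypto-agility sketch: algorithms live in a registry keyed by name,
# so the active scheme is configuration rather than hard-coded logic.
from dataclasses import dataclass
from typing import Callable, Tuple

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


@dataclass(frozen=True)
class SignatureScheme:
    """A pluggable signature algorithm: key generation, signing, verification."""
    keygen: Callable[[], Tuple[object, object]]      # -> (private key, public key)
    sign: Callable[[object, bytes], bytes]           # (private key, message) -> signature
    verify: Callable[[object, bytes, bytes], bool]   # (public key, message, signature) -> ok


def _ed25519_keygen():
    private_key = ed25519.Ed25519PrivateKey.generate()
    return private_key, private_key.public_key()


def _ed25519_verify(public_key, message: bytes, signature: bytes) -> bool:
    try:
        public_key.verify(signature, message)
        return True
    except InvalidSignature:
        return False


# Registry of available algorithms. New entries (an ML-DSA binding, or a hybrid
# classical+PQC mode) can be added here without touching any calling code.
REGISTRY = {
    "ed25519": SignatureScheme(
        keygen=_ed25519_keygen,
        sign=lambda key, msg: key.sign(msg),
        verify=_ed25519_verify,
    ),
    # "ml-dsa-65": SignatureScheme(...),  # hypothetical PQC entry
}

# The active algorithm is configuration, not code, so it can change in the field.
ACTIVE_ALGORITHM = "ed25519"


def sign_message(message: bytes):
    scheme = REGISTRY[ACTIVE_ALGORITHM]
    private_key, public_key = scheme.keygen()
    return scheme.sign(private_key, message), public_key


if __name__ == "__main__":
    sig, pub = sign_message(b"configuration manifest")
    assert REGISTRY[ACTIVE_ALGORITHM].verify(pub, b"configuration manifest", sig)
```

Because callers only reference the registry, swapping the active algorithm, or adding a hybrid entry that runs a classical and a PQC scheme side by side, becomes a configuration change rather than a code rewrite.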


Upgradability at Scale
Stemming from crypto-agility is the need for secure upgradability. If standards shift or a new effective algorithm enters the conversation, developers must know that their infrastructure can handle upgrades at scale without compromising security or performance.

This necessitates the use of dynamic hardware that is robust enough to function as intended while maintaining flexibility to support new software as needed. Rather than hard-coding specific algorithmic capabilities, developers can create a full stack rooted in trust and capable of change.
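
As a rough sketch, an upgrade path might refuse to load any new cryptographic module whose signature fails to verify against a pinned vendor key. The example below uses Ed25519 purely for brevity; in a CNSA 2.0 context, a hash-based scheme such as LMS or XMSS would sign the image, and the check would be anchored in a hardware root of trust. The image contents and function names here are illustrative, not a specific product API.

```python
# Sketch of a gated upgrade: install a new crypto module only if its
# signature verifies against a pinned vendor public key.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519


def apply_update(image: bytes, signature: bytes,
                 vendor_key: ed25519.Ed25519PublicKey) -> bool:
    """Accept a new algorithm implementation only if its signature checks out."""
    digest = hashlib.sha256(image).digest()
    try:
        # Verify the signed digest against the pinned vendor key.
        vendor_key.verify(signature, digest)
    except InvalidSignature:
        return False  # refuse to load unverified code
    # ... hand the verified image to the loader / reconfiguration engine ...
    return True


if __name__ == "__main__":
    # Demo only: in the field, the signing key never leaves the vendor's HSM.
    private_key = ed25519.Ed25519PrivateKey.generate()
    image = b"\x00" * 1024  # placeholder update image
    signature = private_key.sign(hashlib.sha256(image).digest())
    assert apply_update(image, signature, private_key.public_key())
```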


High-Quality Entropy
Often overlooked in the PQC conversation, entropy is foundational to any post-quantum approach to security. Entropy is the measure of unpredictability, or randomness, in the data used for operations like key and random number generation.

The quality of this entropy has a direct impact on security; if encryption keys become at all predictable, they are that much easier for quantum computers to crack. This property is so important that international regulatory bodies are beginning to include checks on the quality and security of random number generation in their compliance audits. Developers must ensure their entropy is reliable, secure, and fully traceable.
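
To make that concrete, here is a simplified health check loosely modeled on the repetition count test from NIST SP 800-90B. The cutoff value is illustrative, and os.urandom merely stands in for whatever raw noise source the platform actually exposes.

```python
# Simplified entropy health check: a raw noise source that emits long runs of
# identical values is likely broken or stuck, so refuse to use its output.
import os


def repetition_count_ok(samples: bytes, cutoff: int = 8) -> bool:
    """Fail if any byte value repeats `cutoff` or more times in a row."""
    run_length = 1
    for prev, cur in zip(samples, samples[1:]):
        run_length = run_length + 1 if cur == prev else 1
        if run_length >= cutoff:
            return False
    return True


if __name__ == "__main__":
    raw = os.urandom(4096)  # replace with the device's raw noise source
    assert repetition_count_ok(raw), "entropy source failed its health check"
```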

Hardware-Software Co-design for Future-Proof PQC

With such an algorithmic focus, PQC is often viewed as a software-based challenge for developers to solve. However, meeting these requirements is not the job of software alone. The best way to create agile and future-proof PQC solutions is by taking a hardware-software co-design approach.

This starts with understanding the hardware best suited to support agile, reliable, and field-upgradable quantum capabilities. One such solution is a Quantum Random Number Generator (QRNG), a device that leverages the inherent random behavior of photons or other subatomic particles to create truly unpredictable number sequences. QRNGs turn the potential threat of quantum capabilities into a protective solution, producing high-quality entropy while maintaining proof of origin and verifiability. This kind of efficient random number generation is key for PQC operations at scale, producing randomized encryption keys without sacrificing performance or bandwidth.
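
As a sketch, key material from a hardware generator might be read from a device node, conditioned, and only then used. The device path below is hypothetical (the actual node depends on the QRNG product and driver), and the OS CSPRNG serves as a fallback when no hardware source is present.

```python
# Sketch of sourcing key material from a hardware entropy device.
import hashlib
import os

HWRNG_PATH = "/dev/hwrng"  # hypothetical; consult the QRNG vendor documentation


def hardware_random_bytes(n: int) -> bytes:
    """Pull raw bytes from the hardware source, falling back to the OS CSPRNG."""
    try:
        with open(HWRNG_PATH, "rb") as dev:
            raw = dev.read(n)
    except OSError:
        raw = b""
    if len(raw) < n:          # device missing or short read
        raw = os.urandom(n)
    # Condition the raw output before using it as key material.
    return hashlib.sha256(raw).digest()


seed = hardware_random_bytes(64)  # 64 raw bytes conditioned into a 256-bit seed
```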

Beyond QRNGs, developers should leverage Field Programmable Gate Arrays (FPGAs) in their PQC infrastructure. With dedicated digital signal processing (DSP) blocks that can be programmed to accelerate computationally intensive algorithms, FPGAs can serve as coprocessors alongside System on Chip (SoC) devices: a trust anchor at the base of the stack, more flexible algorithmic implementations in the middle, and regional or hybrid applications at the top. These devices are also inherently field upgradable, enabling the crypto-agility and flexibility at scale required for a future-proof PQC model.
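
The resulting dispatch pattern can be sketched as follows. The device node and both crypto branches are placeholders rather than a real driver or PQC implementation; the point is that the design prefers the coprocessor when it is available and falls back to software when it is not.

```python
# Sketch of the co-design dispatch pattern: offload to the FPGA coprocessor
# when present, run in software otherwise.
import hashlib
import os

ACCELERATOR_NODE = "/dev/crypto_acc0"  # hypothetical; depends on the FPGA design


def _fpga_operation(public_key: bytes) -> bytes:
    # Placeholder: in practice, hand the operation to the FPGA driver so the
    # DSP blocks perform the heavy lattice arithmetic.
    with open(ACCELERATOR_NODE, "rb") as dev:
        noise = dev.read(32)
    return hashlib.sha256(public_key + noise).digest()


def _software_operation(public_key: bytes) -> bytes:
    # Placeholder: a vetted pure-software PQC library would run here instead.
    return hashlib.sha256(public_key + os.urandom(32)).digest()


def pqc_operation(public_key: bytes) -> bytes:
    """Prefer the hardware coprocessor; fall back to software when absent."""
    if os.path.exists(ACCELERATOR_NODE):
        return _fpga_operation(public_key)
    return _software_operation(public_key)
```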

Staying Ready for Future Quantum Threats
PQC isn’t coming; it’s here. Developers can’t wait for standards and regulations to catch up. They need to build crypto-agile, entropy-assured, regionally flexible systems today. QRNGs and FPGAs are uniquely suited to handle the creation of securely updatable crypto engines that can be programmed for regional requirements and other unique needs.

You can view the full Security Seminar here. To learn more about enabling operational consistency and compliance flexibility for PQC applications, contact our team today.

 
