
Making Post Quantum Cryptography Real with FPGAs

Posted 10/24/2025 by Bob O’Donnell, president and chief analyst of TECHnalysis Research, LLC

One of the more exciting developments now happening in the high-tech world is the work being done to enable quantum computing. After decades of theoretical discussion and development, the last few years have shown tangible progress in this radically different (and enormously complex) new method of computing. Quantum computers perform calculations using quantum bits, or qubits, which can exist in more than one state simultaneously through a property called superposition and can be linked together through a process called entanglement. The manner and speed at which they work allows them to solve extremely sophisticated problems that would take traditional computers years or even centuries of calculation.

As exciting and powerful as these quantum computers can be, however, they’ve also raised some concerning questions in certain fields, particularly cryptography. In modern computing systems, cryptography uses sophisticated mathematical algorithms to scramble the data being processed; to read and use that data, you need a digital key to unlock it. The challenge raised by quantum computing is that both the algorithms originally created to scramble and encrypt the data and, even more importantly, the mechanisms by which digital keys are created and exchanged can potentially be broken with quantum computers. This, in turn, would allow bad actors to unscramble the encrypted data and gain access to it.

Needless to say, that’s a big problem, particularly because massive troves of data in businesses, governments and other organizations all over the world are encrypted with these older algorithms and use these potentially unsafe public/private key distribution methods. As a result, new methods of encryption and new protocols for key exchange that cannot be broken by quantum computers—collectively referred to as post-quantum cryptography (PQC)—have been created to keep that data safe.

The exact timeframe when quantum computers can actually perform these actions is still considered to be several years in the future (2030-2035). However, a more pressing problem is that many bad actors are already starting to capture encrypted data in the hopes they’ll be able to read it in a few years when quantum computers do break the current encryption and key exchange methods. This concept—referred to as Harvest Now, Decrypt Later, or HNDL—is particularly concerning for data that has a long shelf life, such as classified government or military information, account numbers and much more.

Because of these concerns, most major governments either have developed or are in the process of developing new algorithms, processes and requirements to try and mitigate the potential impact of these issues. The US government’s NSA (National Security Agency), for example, has a set of policies referred to as Commercial National Security Algorithm (CNSA) 2.0, and the National Institute of Standards and Technology (NIST) has published the FIPS 203, 204 and 205 post-quantum standards to address these issues.

These widely adopted guidelines incorporate a variety of quantum-safe encryption algorithms and offer support for technologies such as lattice-based and hash-based digital signature mechanisms. In certain key industries and applications, these rules are expected to be in place by the end of this year, with other major milestones scheduled for 2027.
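To make the hash-based signing idea concrete, here is a minimal Python sketch of a Lamport one-time signature, the conceptual ancestor of the LMS/XMSS hash-based schemes those guidelines reference. This is a toy for illustration only (it can sign a single message, once); the real standards layer Merkle trees on top to allow many signatures:

```python
# Toy Lamport one-time signature: sign each bit of a SHA-256 digest by
# revealing one of two pre-committed secrets. Illustration only.
import hashlib, os

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    # Secret key: 256 pairs of random 32-byte values, one pair per digest bit.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    # Public key: the hash of every secret value.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def bits(digest):
    # Expand a 32-byte digest into its 256 bits, most significant bit first.
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]

def sign(sk, msg):
    # Reveal one secret from each pair, chosen by the corresponding digest bit.
    return [sk[i][bit] for i, bit in enumerate(bits(H(msg)))]

def verify(pk, msg, sig):
    # Each revealed secret must hash to the committed public value.
    return all(H(s) == pk[i][bit]
               for i, (s, bit) in enumerate(zip(sig, bits(H(msg)))))

sk, pk = keygen()
sig = sign(sk, b"firmware image v1.2")
assert verify(pk, b"firmware image v1.2", sig)
assert not verify(pk, b"tampered image", sig)
```

Because its security rests only on the strength of the underlying hash function, this style of signature is considered resistant to quantum attack, which is exactly why the guidelines favor hash-based schemes for jobs like firmware signing.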

But just having the policies and algorithms isn’t enough. They need to be implemented on computing devices and within software platforms to ensure they’re capable of running post-quantum cryptography. While that may sound relatively straightforward, it actually requires a multi-layered security approach that touches everything from device attestation—essentially having the device prove what it is and that it hasn’t been tampered with—through firmware update security requirements, a hardware root of trust to hold the cryptographic keys, and all the way down to manufacturing chips in a secure environment.

In other words, while computing devices need to be updated to incorporate chips capable of running these new cryptographic algorithms to allow access to the encrypted data and perform the new methods of key exchange, it takes more than that to fully enable PQC. A critical part of the key exchange process, for example, is ensuring that each device in the computing chain is considered trustworthy. That, in turn, requires firmware-level embedded software through which a device can “attest” to what it is via a unique digital ID that gets embedded into a control chip when it is first manufactured.
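As a rough illustration of how such a manufacturing-time identity can work, here is a simplified Python sketch in the spirit of the DICE approach discussed later in this piece: a Unique Device Secret burned in at manufacture is combined with a hash of the first firmware that runs, so the derived identity changes if that firmware is tampered with. The function names and labels here are illustrative, not an actual vendor API:

```python
# Simplified DICE-style identity derivation (illustrative model only).
import hashlib, hmac

def measure(firmware_image: bytes) -> bytes:
    # Measurement = cryptographic hash of the code about to run.
    return hashlib.sha256(firmware_image).digest()

def derive_cdi(uds: bytes, fw_measurement: bytes) -> bytes:
    # Compound Device Identifier = HMAC(UDS, measurement), following the
    # DICE layering idea: identity depends on both the secret and the code.
    return hmac.new(uds, fw_measurement, hashlib.sha256).digest()

def derive_device_id_seed(cdi: bytes) -> bytes:
    # Seed for the device identity keypair (label is hypothetical).
    return hmac.new(cdi, b"device-identity", hashlib.sha256).digest()

uds = bytes(32)  # stand-in; a real UDS never leaves the chip
good = derive_cdi(uds, measure(b"trusted boot firmware"))
bad = derive_cdi(uds, measure(b"tampered boot firmware"))
assert good != bad  # tampered firmware yields a different identity
```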

Typically, the encryption/decryption process occurs when a device first boots, which means the first chip that turns on and starts the process plays an essential (though brief) role in what happens. That chip basically “announces” who it is to the device’s BIOS/firmware, which passes that information along to the operating system. Then, when any keys need to be exchanged as part of the encryption/decryption process, that first chip provides those keys and its unique ID to assure the machine on the other side of the connection that it is safe to work with. Of course, the reality is more complex than that, but that’s the kind of operation that occurs. It also clarifies why it’s so critical that the first chip in the chain remains uncompromised from its original state—it’s the base from which an entire sequence of security and encryption-related actions occurs.
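A simplified sketch of that boot-time chain of trust might look like the following, assuming a toy model in which each stage’s image is verified and measured before control is handed over. The “signature check” here is deliberately fake and stands in for a real cryptographic verification:

```python
# Toy boot chain: verify each stage, extend a running measurement, hand off.
import hashlib

def extend(measurement: bytes, image: bytes) -> bytes:
    # TPM-style extend: new = H(old || H(image)), a tamper-evident boot log.
    return hashlib.sha256(measurement + hashlib.sha256(image).digest()).digest()

def boot_chain(stages, verify_sig):
    measurement = bytes(32)  # the root of trust starts from a known state
    for name, image, signature in stages:
        if not verify_sig(image, signature):
            raise RuntimeError(f"boot halted: {name} failed verification")
        measurement = extend(measurement, image)
        # ...on real hardware, control would jump into `image` here...
    return measurement  # reported onward as evidence of what booted

# Fake verifier for the sketch: "signature" is just the image's SHA-256.
fake_verify = lambda img, sig: sig == hashlib.sha256(img).digest()
stages = [(n, img, hashlib.sha256(img).digest())
          for n, img in [("bootloader", b"BL"), ("firmware", b"FW"), ("os", b"OS")]]
print(boot_chain(stages, fake_verify).hex())
```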

It also explains why any necessary firmware updates or changes to that base chip need to be highly secure; otherwise, the entire security chain breaks down. Because some of the key cryptographic algorithms and key exchange mechanisms are also stored in the base chip (and occasionally need to be updated as changes/improvements are made), the mechanisms by which those updates are delivered often use encryption and signing as well, further reinforcing the high degree of security required.
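As a sketch of the verify-before-apply discipline such firmware updates require, the following fragment uses the widely available Python `cryptography` package with a classical Ed25519 key. A quantum-safe deployment would swap in a hash-based (LMS) or lattice-based (ML-DSA) signature instead, but the structure stays the same; the dual-flash-bank comment reflects a common A/B update pattern rather than any specific product’s implementation:

```python
# Verify a signed firmware image before ever applying it (sketch).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

vendor_key = Ed25519PrivateKey.generate()  # held only by the device vendor
vendor_pub = vendor_key.public_key()       # provisioned into the device

def apply_update(image: bytes, signature: bytes) -> bool:
    try:
        vendor_pub.verify(signature, image)  # reject anything unsigned
    except InvalidSignature:
        return False
    # ...write `image` to the inactive flash bank, swap banks on next boot...
    return True

update = b"firmware v2.0"
sig = vendor_key.sign(update)
assert apply_update(update, sig)
assert not apply_update(b"malicious image", sig)
```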

Achieving all of these capabilities in a single chip can be challenging, especially because, rather than being a primary processor, it needs to be done via a small control chip that functions in a low power envelope. That’s where Lattice Semiconductor’s MachXO5™-NX TDQ comes in. Packed into a tiny package, MachXO5-NX TDQ incorporates a unique device ID created through technologies such as the Device Identifier Composition Engine (DICE) and the Security Protocol and Data Model (SPDM), as well as two independent blocks of flash memory to allow firmware upgrades and secure boot. In addition, the chip incorporates both traditional and post-quantum cryptographic algorithms, along with the flexibility to use both types for bitstream and data security as organizations make the switch between the two. Finally, it also includes PQC-based encryption and verification methods to ensure that critical key exchanges can also be made in a quantum-safe manner, effectively blunting the HNDL threat.
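During that transition period, a common pattern (and the spirit of what supporting both algorithm types enables) is hybrid key establishment: derive the session key from both a classical shared secret and a PQC shared secret, so an attacker has to break both. A minimal sketch, with stand-in random values in place of real ECDH and ML-KEM outputs:

```python
# Hybrid key combiner: session key depends on BOTH shared secrets (sketch).
import hashlib, hmac, os

def hybrid_session_key(classical_ss: bytes, pqc_ss: bytes,
                       context: bytes = b"hybrid-kex-v1") -> bytes:
    # Concatenate-then-KDF combiner: the output stays secret as long as
    # EITHER input does, covering both classical and quantum attackers.
    return hmac.new(context, classical_ss + pqc_ss, hashlib.sha256).digest()

classical_ss = os.urandom(32)  # stand-in for an ECDH shared secret
pqc_ss = os.urandom(32)        # stand-in for an ML-KEM (FIPS 203) secret
print(hybrid_session_key(classical_ss, pqc_ss).hex())
```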

Bob O’Donnell is the president and chief analyst of TECHnalysis Research, LLC, a market research firm that provides strategic consulting and market research services to the technology industry and professional financial community. You can follow him on X @bobodtech.
