Application Development Blog Posts
anselme_kemgnetueno
Cryptography prevents unauthorized people from reading private messages and therefore helps us protect our digital data, be it in transit, at rest, or even during computation. Using a key, a cryptographic scheme transforms a plaintext into a ciphertext that reveals nothing about the original plaintext. An encryption scheme is considered secure only as long as no practical attack against it is known. This situation may change over time, or new algorithms may be introduced that are more efficient or offer new features. As a result, standards and regulations may require replacing old algorithms or adding new ones to the existing ecosystem, which has historically led to time-consuming and challenging migrations. The solution is crypto-agility: applications that rely on cryptography must themselves be (crypto-)agile, meaning the IT system can migrate to new cryptographic algorithms and standards in a sustainable and swift way [2]. As with software vulnerabilities, it should be possible to quickly update existing software when encryption schemes are likely to become obsolete in the foreseeable future. To understand what this means, it is essential to understand what the problem with cryptography is.

 

The basic principles

Cryptography is the last line of defense against data breaches. It is the discipline of designing and analyzing encryption and hash algorithms that ensure IT security goals such as confidentiality, integrity, and authentication. When all other security measures have failed, cryptography provides the last barrier that protects company secrets against unauthorized access. There are symmetric (aka shared-key) and asymmetric (aka public-key) schemes. Both have advantages and disadvantages, so in real-world settings they are used in a hybrid way: the slower public-key scheme is used to exchange a key over an insecure channel, and that key is then used for faster symmetric encryption of the communication. Public-key schemes base their security on mathematical problems that are hard to solve on a computer. Since its invention in the mid-1970s, public-key cryptography has relied on two such problems, namely integer factorization (see Figure 1) and the discrete logarithm (see Figure 2). Even after several decades of cryptanalysis, their security against classical computers has not been significantly affected.
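To make the hybrid approach concrete, here is a minimal sketch in Python, assuming the open-source "cryptography" package (the article itself does not prescribe any particular library): RSA, whose security rests on the factorization problem of Figure 1, is used only to wrap a freshly generated AES key, and the much faster AES-GCM then encrypts the actual message.

```python
# Minimal sketch of hybrid encryption (assumes the third-party "cryptography" package):
# the slow public-key scheme (RSA) protects only a fresh symmetric key,
# and the fast symmetric scheme (AES-GCM) protects the message itself.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver: generate an RSA key pair and publish the public key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
public_key = private_key.public_key()

# Sender: encrypt the message with a fresh AES-256 key ...
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"company secret", None)

# ... and wrap that key with the receiver's public key (the only slow, public-key step).
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Receiver: unwrap the session key and decrypt the message.
recovered_key = private_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == b"company secret"
```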


Figure 1: Factorization Problem

Figure 2: Discrete Logarithm Problem

The crypto-apocalypse problem

In 1994, American mathematician Peter Shor published a quantum algorithm that efficiently finds the prime factors of an integer and computes discrete logarithms. When quantum computers become powerful enough to run Shor’s algorithm for commonly used key lengths, cryptographic schemes that rely on the difficulty of integer factorization or the discrete logarithm will be broken. Public-key cryptography, however, is present in a multitude of applications. It underpins the Internet, e-mail, key management, secure shell, virtual private networks, distributed ledgers, code signing, and basically any scenario that requires security goals such as confidentiality, authenticity, and integrity. Today, the Internet enables 4.1 billion users to visit 2 billion websites and generates $3 trillion in retail activity while relying on public-key cryptographic standards to ensure our privacy and the security of our data [2]. In other words, a large-scale quantum computer represents an apocalyptic cybersecurity threat to our IT infrastructure. Current schemes need to be replaced by schemes whose security does not rest on integer factorization (Figure 1) or the discrete logarithm (Figure 2).

 


Figure 3: Post-Quantum Crypto-Families

The NIST counterattack

As a reaction to the quantum threat to cryptography, in 2016 the National Institute of Standards and Technology (NIST) initiated a process to solicit, evaluate, and standardize quantum-resistant public-key cryptographic algorithms [4]. In its call for proposals, NIST requested schemes that implement one or more of the following functionalities: public-key encryption, key encapsulation mechanisms (KEM), and digital signatures. The cryptography community is now investigating what is called Post-Quantum Cryptography (PQC) [4, 10]. PQC relies on different mathematical problems (see Figure 3) and provides encryption schemes that run on conventional computers and are believed to be secure against attacks from both classical and quantum computers. Such schemes are therefore called quantum-safe, quantum-resistant, or post-quantum. Following its earlier standardization processes for the Advanced Encryption Standard (AES) and the Secure Hash Algorithm (SHA), NIST’s goal is to select quantum-safe schemes to replace current schemes that are vulnerable to quantum attacks.

Bruce Schneier [1], a renowned cryptographer, compares the NIST process to a demolition derby consisting of a few rounds, each lasting a few years. Participants put their algorithms into the ring and then try to break each other’s schemes in each round. With input from the crypto community, NIST selects the winners that will be standardized. From the 82 initial schemes, NIST selected only four in the third round (in July 2022) and moved four others to the fourth round. The primary winners are KYBER (for cryptographic key exchange) and DILITHIUM (for digital signatures). The additional selections are the signature schemes FALCON (chosen for its smaller signature size) and SPHINCS+ (chosen for its different security assumptions). The algorithms moving to the fourth round are all key exchange algorithms: BIKE, Classic McEliece, HQC, and SIKE. The key lesson is that, unlike current schemes, many PQC schemes are still young and need to survive years of cryptanalysis to be considered safe.
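To illustrate what cryptographic key exchange with a KEM such as KYBER looks like at the API level, here is a minimal sketch assuming the open-source liboqs library and its Python bindings (liboqs-python); the library choice and the "Kyber512" parameter set are assumptions made for illustration, not something mandated by NIST or this article.

```python
# Illustrative sketch only: assumes the liboqs-python bindings (package "oqs")
# are installed and expose the "Kyber512" KEM identifier.
import oqs

kem_alg = "Kyber512"  # assumed parameter set; choose what your security policy requires

# Receiver: generate a key pair and publish the public key.
with oqs.KeyEncapsulation(kem_alg) as receiver:
    public_key = receiver.generate_keypair()

    # Sender: encapsulate a fresh shared secret under the receiver's public key.
    with oqs.KeyEncapsulation(kem_alg) as sender:
        ciphertext, shared_secret_sender = sender.encap_secret(public_key)

    # Receiver: decapsulate the ciphertext to recover the same shared secret.
    shared_secret_receiver = receiver.decap_secret(ciphertext)

assert shared_secret_sender == shared_secret_receiver
# The shared secret can now key a symmetric cipher such as AES-GCM, just as in the
# hybrid pattern described earlier.
```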


Figure 4: NIST PQC Selection by Rounds

The post-quantum immaturity

In summary, we went from 82 initial submissions to 69 in the first round, then 26 in the second round, 15 in the third round, and finally 8 schemes (4 selected for standardization and 4 moved to the fourth round) (see Figure 4). This shows that it is difficult to design secure schemes and that PQC is still immature. While the process advanced, many schemes were completely broken. This happened even to Rainbow [9], a third-round finalist, and to SIKE [5, 6, 7, 8], a promising scheme that had advanced to the fourth round. Additionally, the security levels of some winners (KYBER and DILITHIUM) were weakened, bringing them below the threshold defined by NIST [1, 3]. It is important to note that most of these attacks were performed with classical computing techniques; unlike current schemes, PQC schemes rely on more complicated and less well-understood mathematics. How these schemes will resist attacks from a truly powerful quantum computer is still unclear.

 

The quantum uncertainty

In the coming years, we will face several uncertainties related to the advent of quantum computing. First, it is not clear whether the selected schemes will survive classical attacks: algorithms can be improved to address new attacks, but the attacks can be improved as well. Second, while Shor’s algorithm theoretically breaks current cryptography, it requires a powerful quantum computer, and that technology is not there yet. The engineering challenge might turn out to be insurmountable, or a technological leap might make it happen sooner than expected. Our understanding of what such machines will ultimately be capable of is also still limited. This will change if a universal fault-tolerant quantum computer does become a reality; our understanding of quantum computation will then improve over time and probably open up new cryptanalytic possibilities. There are essentially two ways to address these uncertainties. The first is diversity among the PQC schemes themselves: we need schemes relying on a diverse set of computational problems. The second is that, once we have this diversity, our IT systems must be able to easily exchange cryptographic implementations and schemes when old ones are broken.

 


Figure 5: Non-Crypto-agile Application

Figure 6: Crypto-agile Application

The crypto-agility urgency

NIST is doing a great job standardizing new PQC schemes. This alone will not be enough, as organizations still need to integrate those schemes to be prepared for the sudden advent of a large-scale quantum computer. Beyond the threat of quantum computing, attacks using classical computing may also evolve, relying on new algorithmic insights or new computing resources such as the cloud to make encryption schemes obsolete or key sizes too small. This has happened in the past, requiring time-consuming migrations that can prove tedious for critical systems [2]. Therefore, we need to be prepared for crypto-schemes to get broken, and the solution is agility.

Current applications are not crypto-agnostic, as algorithm names and parameters are usually hardcoded in the source code (see Figure 5). To add or replace algorithms (perhaps due to a new standard, the retirement of an algorithm, or a cryptanalysis breakthrough), we need to update the source code and rebuild it. This is disruptive, as it requires stopping a functioning system to roll out the updated applications. Additionally, PQC schemes have different requirements (key and ciphertext sizes, decryption-failure rates, statefulness), making their integration even more challenging. This is not only inflexible but also takes time and might introduce inadvertent implementation errors. The situation will be even more chaotic in case of a sudden cryptanalysis breakthrough.
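As a minimal illustration of the problem sketched in Figure 5 (the function and the algorithm choice below are hypothetical, not taken from any real application), a non-crypto-agile application might look like this:

```python
# Sketch of a non-crypto-agile application (cf. Figure 5): the algorithm, mode,
# and key size are baked into the code, so any change forces a code change,
# a rebuild, and a redeployment. Assumes the "cryptography" package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_record(key: bytes, record: bytes) -> bytes:
    # AES-256-GCM is hardcoded here; switching to another scheme
    # (e.g., a post-quantum construction) means editing this function.
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, record, None)
```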

The security of our IT systems therefore needs to be made robust against unpredictable cryptographic vulnerabilities, revealed implementation flaws, emerging hardware accelerators, and other threats [2]. Figure 6 illustrates a crypto-agile application that, unlike the one in Figure 5, does not call crypto-algorithms by name. The application is instead equipped with a configuration file and sends crypto requests to a broker, which calls the configured algorithm. New cryptography is integrated simply by updating the configuration, without stopping or rebuilding the application. Given the quantum uncertainty described above, crypto-agility is urgently needed. Crypto-agility goes beyond PQC migration: it addresses not only the challenge of migrating our current systems to quantum-safe algorithms, but is also the highly recommended long-term solution that will allow us to react easily and quickly to future, unpredictable attacks on cryptography.
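The sketch below illustrates the broker idea from Figure 6 under some assumptions of mine: a hypothetical configuration file named crypto.json, a small registry of pluggable implementations, and the same "cryptography" package as above. The names and structure are illustrative only, not an existing SAP or NIST interface.

```python
# Sketch of a crypto-agile application (cf. Figure 6): the application asks a broker
# to encrypt; the broker looks up the algorithm in a config file, so a new scheme is
# adopted by changing configuration, not application code.
import json
import os
from typing import Callable, Dict

from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

def _aes_gcm_encrypt(key: bytes, data: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + AESGCM(key).encrypt(nonce, data, None)

def _chacha20_encrypt(key: bytes, data: bytes) -> bytes:
    nonce = os.urandom(12)
    return nonce + ChaCha20Poly1305(key).encrypt(nonce, data, None)

# Registry of available implementations; a quantum-safe scheme could be registered
# here later without touching the application code that uses the broker.
ALGORITHMS: Dict[str, Callable[[bytes, bytes], bytes]] = {
    "AES-256-GCM": _aes_gcm_encrypt,
    "ChaCha20-Poly1305": _chacha20_encrypt,
}

class CryptoBroker:
    """Reads the configured algorithm name and dispatches crypto requests to it."""

    def __init__(self, config_path: str = "crypto.json"):
        # The algorithm choice lives outside the application code,
        # e.g. {"encryption_algorithm": "AES-256-GCM"}.
        with open(config_path) as f:
            self.algorithm = json.load(f)["encryption_algorithm"]

    def encrypt(self, key: bytes, data: bytes) -> bytes:
        return ALGORITHMS[self.algorithm](key, data)

# The application never names an algorithm; it only talks to the broker:
# broker = CryptoBroker()
# ciphertext = broker.encrypt(key, b"company secret")
```

With this structure, adopting a new scheme amounts to registering one more implementation and changing a single line of configuration; the application code that calls the broker stays untouched.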

 

Acknowledgments

Thank you to the following people for reviewing and improving a previous version of this article: Andrey Hoursanov, Jonas Boehler, Paul McElligott, and Peter Limacher.

 

References

[1] https://www.schneier.com/crypto-gram/archives/2022/0815.html#cg14

[2] https://arxiv.org/ftp/arxiv/papers/1909/1909.07353.pdf

[3] https://marketing.idquantique.com/acton/attachment/11868/f-0587a79f-5592-47fe-9bdf-a3f3e7f7d802/1/-/...

[4] https://csrc.nist.gov/Projects/post-quantum-cryptography

[5] https://eprint.iacr.org/2022/975.pdf

[6] https://research.nccgroup.com/2022/08/08/implementing-the-castryck-decru-sidh-key-recovery-attack-in...

[7] https://www.quantamagazine.org/post-quantum-cryptography-scheme-is-cracked-on-a-laptop-20220824/

[8] https://ellipticnews.wordpress.com/2022/07/31/breaking-supersingular-isogeny-diffie-hellman-sidh/

[9] https://eprint.iacr.org/2022/214

[10] https://radar.tools.sap/en/itonics_technology_tr3/95/details#sort_type=ds_created&sort_order=desc