With the advent of information security, cryptography has gained considerable prominence because modern data communications require security mechanisms such as confidentiality, integrity, authenticity, and non-repudiation.
Quantum computing has now emerged as a serious threat to the cryptographic algorithms in use today, including symmetric ciphers, asymmetric algorithms and hash functions. Recent research on quantum platforms suggests that hard mathematical problems, intractable on traditional computing platforms, may become solvable. NIST started the standardization process for quantum-safe (quantum-resilient) algorithms a few years ago and there has been notable progress, but proper finalization and incorporation into applications will certainly take time. This article informs the reader about the consequences of quantum computing and the hybrid approach introduced by NIST.
Need for Post-Quantum Cryptography
The advanced cryptographic mechanisms in use today (symmetric, asymmetric and hash algorithms) represent roughly two decades of effort. The entire public key infrastructure, which is ultimately based on asymmetric cryptography and serves as the backbone of TLS/SSL-based e-commerce, builds on this effort as well. The concept of quantum computing was first introduced in the 1980s, but the latest research promises quantum computers with extraordinary processing power, capable of solving traditionally intractable mathematical problems such as finding discrete logarithms and factoring large integers. The realization of such a quantum computer affects symmetric and hash algorithms to the extent that their effective cryptographic strength is cut in half. Symmetric and hash algorithms are available in various key and output sizes, so the immediate and effective countermeasure is to use longer/stronger key lengths. The situation is entirely different for asymmetric algorithms, because they rely on hard/complex mathematical problems that quantum computers will be able to solve.
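As a concrete illustration of the "longer key lengths" countermeasure, the following minimal Python sketch encrypts data with a 256-bit AES-GCM key instead of a 128-bit one, so that even a quantum square-root speed-up leaves a comfortable effective security margin. The use of the third-party cryptography package is an assumption for this sketch; the article does not prescribe any particular library.

```python
# Minimal sketch: choosing a 256-bit symmetric key so that even a quantum
# square-root speed-up leaves roughly 128 bits of effective security.
# Assumes the third-party "cryptography" package (pip install cryptography).
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # AES-256 instead of AES-128
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce, never reused per key
ciphertext = aesgcm.encrypt(nonce, b"sensitive payload", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)
assert plaintext == b"sensitive payload"
```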
Post-Quantum Cryptography Standardization
Post-Quantum Cryptography (PQC) standardization deals with the development and finalization of post-quantum algorithms which will be deployed on the classical computing platforms in use today and are designed to resist attacks by quantum computers. NIST issued a general public call for the standardization of post-quantum algorithms in the following domains:
- Encryption Algorithms
- Digital signature schemes
- Key establishment schemes
In Round 1 of the NIST PQC standardization process, 69 candidates were accepted. In Round 2, many candidates were eliminated and only 26 remained. Currently, no algorithms have been finalized/standardized as “quantum-resistant” or “quantum-safe”, and the finalization process will take time because a final decision depends on several critical factors, such as:
- Speed/Efficiency
- Security testing and Cryptanalysis
- Interoperability & Usability with existing protocols/standards.
Every post-quantum candidate has advantages and disadvantages. For some candidates, not enough research has been carried out yet to establish confidence in the respective schemes. There is a chance that more than one algorithm will be standardized, but the whole process will take time to finalize.
The Hybrid Approach
The hybrid mode/approach can be described as an evolutionary step toward PQC: it provides for the use of a classic, standardized cryptographic algorithm combined with a post-quantum algorithm.
This ensures compliance with specific standards and regulations, depending on the area of deployment (e.g. PCI DSS in the context of banking and card-based transactions).
It also redundantly includes those algorithms which are, to the best of our knowledge as of today, PQC-safe.
But we cannot deploy them alone, as they are not standardized by the relevant regulatory bodies. In addition, they are only candidates: with a high likelihood of being PQC-safe, but with no certainty. We simply lack the proof, and of course it is missing, as sufficiently powerful quantum computers are not there yet.
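To illustrate how such a hybrid scheme can be built, the following Python sketch combines a classical X25519 key exchange with a post-quantum KEM shared secret by feeding both into a single key derivation function. The X25519 and HKDF calls use the third-party cryptography package; pq_kem_encapsulate is a hypothetical placeholder for whichever PQC KEM candidate is eventually chosen, since the article does not name a specific library or construction.

```python
# Minimal hybrid key-establishment sketch (an illustrative assumption, not
# NIST's normative construction). Both shared secrets feed one KDF, so the
# session key stays secure as long as at least one algorithm remains unbroken.
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

def pq_kem_encapsulate(peer_pq_public_key):
    """Hypothetical placeholder for a post-quantum KEM (e.g. a NIST candidate).
    Returns (ciphertext, shared_secret); swap in a real PQC library here."""
    shared_secret = os.urandom(32)          # stand-in value for this sketch
    return b"pq-ciphertext", shared_secret

# Classical part: X25519 Diffie-Hellman key exchange
our_priv = X25519PrivateKey.generate()
peer_priv = X25519PrivateKey.generate()     # stands in for the remote party
classical_secret = our_priv.exchange(peer_priv.public_key())

# Post-quantum part: KEM encapsulation against the peer's PQ public key
_pq_ciphertext, pq_secret = pq_kem_encapsulate(b"peer-pq-public-key")

# Hybrid combination: concatenate both secrets and derive one session key
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid key establishment",
).derive(classical_secret + pq_secret)
```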
Hence this hybrid mode leads directly down the road towards crypto-agility.
In the "Report on Post-Quantum Cryptography" , NIST points out right in the beginning that it "recognizes the challenge of moving to new cryptographic infrastructures and therefore emphasizes the need for agencies to focus on crypto agility."
Crypto-Agility as the Backbone of the Hybrid Approach
In basic terms, agility is the property of a system that allows it to adapt swiftly to new approaches. Similarly, crypto-agility refers to the ability of an information security system to switch swiftly to alternative cryptographic algorithms and primitives. Crypto-agility not only supports system development and evolution but also acts as a safety measure and incident response mechanism. NIST strongly recommends shifting to crypto-agile architectures and following design strategies and principles that support crypto-agility by incorporating the latest and most secure cryptographic algorithms and key lengths. For example, systems whose symmetric and hash algorithms can easily be shifted to longer key lengths are crypto-agile. Newer systems should be designed so that they can be moved to new algorithms and cryptographic primitives with minimal effort and time. As soon as post-quantum algorithms are standardized and published, organizations will start incorporating them into their products so that users can switch over to the secure ones in case an algorithm is broken or compromised, thereby achieving crypto-agility.
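One simple way to build in this kind of agility is to hide the concrete algorithm behind a small abstraction and select it by name from configuration, so a future switch (say, to a standardized post-quantum primitive) becomes a configuration change rather than a code rewrite. The sketch below is a minimal illustration of that idea; the names digest and ALGORITHM_REGISTRY are illustrative and not taken from the article or any standard.

```python
# Minimal crypto-agility sketch (illustrative design, not from the article):
# callers ask for an algorithm by name, so swapping in a newer primitive
# only requires registering it and changing one configuration value.
import hashlib
from typing import Callable, Dict

# Registry mapping algorithm names to hash constructors.
ALGORITHM_REGISTRY: Dict[str, Callable] = {
    "sha256": hashlib.sha256,
    "sha3_512": hashlib.sha3_512,   # longer output as a quantum-era margin
}

def digest(data: bytes, algorithm: str = "sha3_512") -> bytes:
    """Compute a digest using whichever algorithm the configuration names."""
    try:
        hasher = ALGORITHM_REGISTRY[algorithm]()
    except KeyError:
        raise ValueError(f"Unknown or retired algorithm: {algorithm}")
    hasher.update(data)
    return hasher.digest()

# Switching algorithms is a one-line configuration change, not a code rewrite.
print(digest(b"message", algorithm="sha256").hex())
print(digest(b"message").hex())                      # default: sha3_512
```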
References and Further Reading
- Read more on Post-Quantum Cryptography (2018 - today), by Terry Anton, Utimaco, and others, with guest contributions by the Institute of Quantum Computing, Samsung, Entrust Datacard and many independent security experts
- NISTIR 8105, Report on Post-Quantum Cryptography (2016), by Lily Chen, Stephen Jordan, Yi-Kai Liu, Dustin Moody, Rene Peralta, Ray Perlner and Daniel Smith-Tone, Computer Security Division, Applied and Computational Mathematics Division, Information Technology Laboratory, National Institute of Standards and Technology
Blog post by Dr. Ulrich Scholten
About the author
Ulrich Scholten is an internationally active entrepreneur and scientist. He holds a PhD in information technology and owns several patents on cloud-based sensors. His research on cloud computing is regularly published in highly rated journals and conference papers. From 2008 to 2015, he was an associate research scientist at the Karlsruhe Service Research Institute (KSRI), a partnership between KIT and IBM, where he researched network effects around web platforms together with SAP Research.