With the quest to build a full-scale quantum computer now expanding beyond the science lab to encompass technology giants (such as IBM, Google and Microsoft) and quantum computing start-ups (including Rigetti, IonQ and 1QBit), the race is on to create the hardware and software powering the next generation of computation.
Quantum simulation looks set to revolutionize the chemical and pharmaceutical industries. AI efforts could be enhanced with quantum machine learning, creating huge opportunities for businesses to optimize how they use their data. But as the charge towards a universal quantum computer starts to unlock these benefits for business and society as a whole, the threat to cybersecurity systems around the world looms larger. Quantum computers, once built up to their full potential, will be able to solve particular kinds of hard problems – and one of those problems sits at the root of protecting the world’s information. The excitement about the potential of quantum computers is therefore matched by worry about our current encryption methods.
Many security protocols – such as RSA, Diffie-Hellman and Elliptic-Curve Cryptography – use public-key techniques based on hard mathematical problems such as integer factorization and the discrete logarithm. They are the most popular and prevalent cryptographic algorithms thanks to their efficiency, and they are the industry standard, built into early versions of the Secure Sockets Layer (SSL) protocol, for example.
The strength of the integer factorization problem – and thus the reliability of these methods for everyday security – is that it simply takes conventional computers too long to compute the answer and break the encryption. Quantum computers have a completely different architecture, and a full-scale machine running Shor’s famous algorithm would be able to factor large integers quickly, putting these pervasive conventional security methods at risk.
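To see why factoring protects these systems, here is a toy sketch (an illustration only, not anything like a real attack) of the brute-force trial division a classical computer would face. Real RSA moduli are thousands of bits long, so a search that scales with the square root of the number is hopeless at that size:

```python
# Illustrative only: a toy trial-division factoriser. Classical factoring
# methods are far better than this, but all known classical approaches
# still take infeasibly long on real 2048-bit RSA moduli.
def trial_division(n: int) -> int:
    """Return the smallest prime factor of n (for n > 1)."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

# A toy "RSA modulus": the product of two small primes.
p, q = 104729, 104723
n = p * q
print(trial_division(n))  # recovers the smaller prime factor, 104723
```

Shor’s algorithm, run on a sufficiently large quantum computer, would solve the same problem in time that grows only polynomially with the size of the number – which is exactly what makes it so dangerous to RSA-style schemes.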
No-one has come close to building a universal quantum computer capable of running Shor’s algorithm, so conventional security methods are safe from quantum computing right now. But standards organizations are already hunting for new quantum-safe algorithms, so they can be adopted globally across the industry in time for a future quantum revolution. The National Institute of Standards and Technology (NIST) put out a call to researchers in 2016 to submit post-quantum algorithms, and only this year narrowed the field to 26 candidates. It is still evaluating which should become the new standard.
In some sense this pace is understandable, as there is no immediate rush – we are still far from having a full-scale universal quantum computer. The most advanced machines released by universities and companies in 2019 are extremely nascent compared with what would be needed to run Shor’s algorithm against real-world key sizes. On the other hand, no one can predict the future, and many of the finest mathematical, engineering and computer science minds are working hard to build quantum computers. Having regulation and agreed standards in place should therefore be a priority, so that the subsequent global rollout of the new algorithms can begin before companies and governments are taken by surprise.
Quantum-safe algorithm systems
Quantum-safe algorithms need to be based on mathematical problems which are not only too difficult for conventional computers to solve in a reasonable time but also too tricky for quantum computers. Most of these algorithms can be grouped into three families of problems: lattice cryptosystems, code-based systems and multivariate systems.
Lattice-based algorithms rest on a core problem called the Shortest Vector Problem (SVP), which is about finding the shortest non-zero vector in a lattice. If you take a two-dimensional lattice (a grid of regularly spaced dots you could draw on a piece of paper) and draw a line from the bottom left-hand corner (zero) to a dot in the grid, you have a vector. Vectors can be added, subtracted and multiplied by whole numbers to traverse the lattice and find routes to other dots, in the process creating new lines from the zero point to each new destination. The Shortest Vector Problem asks you to take a given set of vectors (a basis for the lattice) and work out which combination of them reaches the dot closest to zero. This sounds easy enough, particularly in a two-dimensional lattice, but as you add dimensions – there could be up to 10,000 – the problem quickly becomes too difficult for both conventional and quantum computers, making it a compelling candidate for quantum-safe cryptography.
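The two-dimensional case described above can be sketched in a few lines of code – a toy brute-force search over small integer combinations of two basis vectors, nothing like the specialised lattice algorithms real cryptanalysts use:

```python
# A sketch of the Shortest Vector Problem in two dimensions: try every
# small integer combination of two basis vectors and keep the shortest
# non-zero lattice point found. In thousands of dimensions this search
# space explodes, which is what lattice-based cryptography relies on.
from itertools import product
from math import hypot

def shortest_vector(b1, b2, bound=10):
    """Search coefficients in [-bound, bound] for the shortest non-zero vector."""
    best, best_len = None, float("inf")
    for a, b in product(range(-bound, bound + 1), repeat=2):
        if a == 0 and b == 0:
            continue  # the zero vector doesn't count
        v = (a * b1[0] + b * b2[0], a * b1[1] + b * b2[1])
        length = hypot(*v)
        if length < best_len:
            best, best_len = v, length
    return best, best_len

# With basis vectors (3, 1) and (2, 2), the combination -1*(3, 1) + 1*(2, 2)
# gives (-1, 1), which is shorter than either basis vector.
v, length = shortest_vector((3, 1), (2, 2))
print(v, length)  # (-1, 1) with length sqrt(2)
```

Note how even here the shortest vector is not one of the basis vectors you started with – in high dimensions, finding it among exponentially many combinations is what makes the problem hard.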
Code-based cryptosystems rest on a tough mathematical problem called syndrome decoding, which is linked to error correction. Error-correcting codes fix the mistakes – such as a 1 that should be a 0, or vice versa – that inevitably occur during the transmission of digital messages. In cryptographic use, these become essentially secret decoding functions which only the person in possession of them can use to recover the original message – the same way most forms of cryptography have worked throughout the ages: from Caesar’s men running secret messages to Roman battlefields, where the generals already held the recipe to translate them from gibberish, to the German military’s Enigma ciphers that Alan Turing and his Bletchley Park colleagues managed to crack.
These error-correcting codes are usually used in situations where it would be cumbersome or costly to retransmit messages to work out where the errors are, for example, receiving data from faraway spacecraft, which can take hours to transmit back to Earth – not something you’d want to do many times. Instead, the error-correcting code is essentially baked into the message by deliberately creating ‘errors’, or scrambles of the original message, within. Without the recipe on the other end to decode this scrambled message, it’s computationally too hard for both standard and quantum computers to reverse the errors and the encoding to crack the message. The most popular of these error-correcting codes is called a Goppa code, which could be a way to save security professionals from quantum-powered attacks.
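The encode-and-correct principle can be illustrated with the simplest possible error-correcting code – a 3-bit repetition code. This toy is far weaker than the Goppa codes used in real code-based schemes such as McEliece, but the core idea is the same: redundancy lets the legitimate receiver undo bit flips.

```python
# A minimal sketch of error correction with a 3-bit repetition code.
# Real code-based cryptosystems use far more sophisticated codes, but
# the principle is identical: redundancy lets the holder of the decoding
# recipe reverse deliberate or accidental bit flips.
def encode(bits):
    """Repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority-vote each group of three, correcting one flip per group."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
sent = encode(msg)
sent[1] ^= 1  # simulate a transmission error (bit flip)
sent[9] ^= 1  # and another, in a different group of three
assert decode(sent) == msg  # both errors corrected
```

In a scheme like McEliece, the sender deliberately adds such errors during encryption; only the holder of the secret code structure can strip them out efficiently, while an attacker faces the hard syndrome-decoding problem.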
For multivariate systems, the clue is in the name: these algorithms are based on the difficulty of solving systems of equations with many variables. If you remember solving simultaneous equations in high school – with just two variables, say x and y – it soon becomes clear how difficult systems with three, ten or even hundreds of variables can be. The most promising schemes include the creatively named Unbalanced Oil and Vinegar (UOV) scheme and the Rainbow scheme; these can be used for digital signatures, just two options among many in the quest to replace our current public-key methods.
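A hypothetical toy system shows why the variable count matters. Brute-forcing a system of quadratic equations over GF(2) (all arithmetic mod 2) means trying 2ⁿ assignments for n variables – trivial for three variables, utterly infeasible for hundreds. The equations below are made up for illustration; real schemes like UOV hide a trapdoor inside such systems.

```python
# A sketch of why multivariate systems get hard: brute-force a small
# system of quadratic equations over GF(2). With n variables the search
# space is 2**n, so systems with hundreds of variables are out of reach.
from itertools import product

def solve_mq(equations, n):
    """Return all 0/1 assignments satisfying every equation (== 0 mod 2)."""
    return [x for x in product((0, 1), repeat=n)
            if all(eq(x) % 2 == 0 for eq in equations)]

# Hypothetical toy system in three variables x0, x1, x2:
eqs = [
    lambda x: x[0] * x[1] + x[2] + 1,  # x0*x1 + x2 + 1 = 0 (mod 2)
    lambda x: x[1] * x[2] + x[0],      # x1*x2 + x0     = 0 (mod 2)
]
print(solve_mq(eqs, 3))  # [(0, 0, 1)] is the unique solution
```

Signing with a scheme like UOV amounts to using secret structure to solve such a system quickly; verifying a signature only requires plugging the values back in, which is easy for anyone.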
Future-proofing encryption standards
The direction of post-quantum cryptography isn’t to find ‘one quantum-safe algorithm to rule them all,’ though. As the history of cryptography has shown us – from the ciphers used in the Spartan military to the Bletchley Park code-breakers in World War II – old methods become useless as researchers become smarter. The algorithms being put forward by mathematicians, cybersecurity researchers and quantum computing scientists as quantum-safe standards may one day prove to be not so quantum-safe after all. The scientists of the future might find ways to break them, just as Shor’s algorithm was found to break integer factorization.
But uncertainty about the future safety of these new algorithms doesn’t remove the need to tighten our security standards now, based on what we already know: that quantum computing will pose a huge threat to current systems. As emerging technologies reach the market, we need stronger standards so that innovators can make the most of these incredible new inventions, rather than feeling threatened by them and missing out on their potential to improve the state of the world.
The cybersecurity industry must be comfortable questioning its confidence in existing methods. It must ensure the traditional, sometimes slow-moving institutions in charge of standards and regulations move more quickly. The industry cannot fall into the trap of assuming quantum computing is still too far away to be a threat to ‘business as usual.’
It’s only all of the world’s digital information that’s at stake, after all.
This article represents the personal opinion of the author.
Article published in 2019.