With this month's column, we start a 12-part series called Navigator MasterClass, in which we will find our way through the myths and realities of one Bleeding Edge technology each month: where it truly stands at the time of writing, and its business applications – implemented, being attempted, or speculative. Rest assured that this will NOT be an educational initiative. We will, however, spend a few lines setting up a common understanding of the technology.
The technology for this month is Quantum Computing (QC). Like any emerging technology (or even concept), there is a lot of hype without real understanding in the early days. If one were to Google "Quantum Computing", the 2 billion+ results can be broadly classified into five categories:
- Quantum made easy
- Why is Quantum so difficult to understand?
- Quantum for IT departments
- Particle Physics theory
- Applications (under development or being speculated/planned)
The reason Quantum Computing is so difficult to explain or understand is that it is based on the esoteric science of Quantum Physics, which is nowhere near as easy to learn (or even teach) as the mathematical models used in Classical Computing. What we will do now is understand the challenges, and why it may be more than a few years before QC goes mainstream. We will also look at some business applications being planned/built as we speak/read/write.
Most of us know that QC uses Qubits, which are bits that exist in multiple states simultaneously (as against the classical computing bit, which can only be "0" or "1"). In popular literature(!) this concept, called Superposition, is described as being "0" and "1" simultaneously. In reality, Superposition is a "complex linear combination" that assigns amplitudes to "0" and "1". Through interactions, these amplitudes produce "destructive" or "constructive" interference, until an outcome is destroyed ("0") or survives ("1"). The whole world of QC algorithms is built around crafting and sequencing constructive and destructive interference.
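To make the amplitude picture a little more concrete, here is a minimal sketch (in Python with numpy, purely for illustration and not a real quantum program) of a single qubit as a pair of complex amplitudes. Applying a Hadamard gate once creates an equal superposition of "0" and "1"; applying it a second time makes the "1" amplitude cancel out through destructive interference, leaving "0" with certainty:

```python
import numpy as np

# A qubit state is a vector of two complex amplitudes: (amplitude of "0", amplitude of "1").
ket0 = np.array([1, 0], dtype=complex)   # the classical "0"

# The Hadamard gate puts a qubit into an equal superposition of "0" and "1".
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

superposed = H @ ket0          # amplitudes (1/sqrt(2), 1/sqrt(2))
back = H @ superposed          # apply the gate again

print(np.round(superposed, 3))   # [0.707+0.j 0.707+0.j]
print(np.round(back, 3))         # [1.+0.j 0.+0.j] -- the "1" amplitude cancels (destructive interference)

# Measurement probabilities are the squared magnitudes of the amplitudes.
print(np.abs(superposed) ** 2)   # [0.5 0.5]
```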
Also, QC is good when the problem at hand shows "scaling behavior", i.e., each step leads to exponential growth in the steps that follow. So, it is not the challenge of computing all possible moves in chess (which supercomputers can do today), but the challenge of the traveling salesman problem or the Chinese postman problem.
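As a rough back-of-the-envelope illustration of this scaling behavior, consider how the number of candidate routes in the traveling salesman problem grows with the number of cities (assuming the usual (n-1)!/2 count of distinct round trips):

```python
import math

# Number of distinct round-trip routes through n cities: (n - 1)! / 2
for n in (5, 10, 15, 20, 25):
    routes = math.factorial(n - 1) // 2
    print(f"{n:>2} cities -> {routes:.3e} candidate routes")
```

By 25 cities the count is already in the region of 10^23 routes, which is why brute force stops being an option very quickly.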
What does all this mean for Business and its IT? First of all, there are no agreed metrics to measure the progress of QC research. Most big-name research organizations are focused on the number of Qubits. While this is important, it is Gate Errors that are being ignored, except perhaps by Google's Sycamore team and USTC in China. The potential of QC can be realized only with minimal gate errors, and this can only be done through the classical techniques of repetition, replication, and redundancy. This error correction will use up some Qubits as gatekeepers; until we know how many, there is no telling how much usable power remains.
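The repetition-and-redundancy idea is easiest to see in its classical form. The sketch below (illustrative only; real quantum error correction, such as surface codes, is far more involved because qubits cannot simply be copied) stores one logical bit as three physical bits and recovers it by majority vote, at the cost of those extra "gatekeeper" bits:

```python
import random

def encode(bit):
    # Repetition: one logical bit stored as three physical copies.
    return [bit] * 3

def noisy_channel(bits, flip_prob=0.1):
    # Each physical bit flips independently with probability flip_prob.
    return [b ^ 1 if random.random() < flip_prob else b for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit as long as at most one copy flipped.
    return int(sum(bits) >= 2)

trials = 100_000
errors = sum(decode(noisy_channel(encode(1))) != 1 for _ in range(trials))
print(f"logical error rate: {errors / trials:.4f}  (raw bit error rate: 0.1)")
```

The logical error rate comes out much lower than the raw rate, but only because three physical bits were spent on every logical bit; that overhead is exactly the point made above.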
Another challenge that will keep QC some time away from business is "Decoherence". The slightest vibration, sound, or light turns a Qubit into a good old bit. So systems engineering is looking at solutions such as special environments of near-zero temperature or absolute vacuum. Once this is stabilized, cloud computing will bring QC to the business world.
The third challenge is of the QC industry's own creation: the race to "Quantum Supremacy". This essentially requires that a QC perform a mathematical calculation that is beyond the reach of the most powerful supercomputer. This has not been achieved yet, the task being made more difficult by supercomputing advancing on its own (thus making the goal a moving target).
Before we look at ongoing Business IT work using QC, let us understand that QC will not replace classical computing; i.e., the mobile phone will not start using Qubits instead of bits. In fact, for many applications, QC will be slower than classical computing. QC will be used in cases that demonstrate the Scaling Behavior we talked about earlier.
MIT Technology Review lists two types of applications of QC, the first being "simulating the behavior of matter down to the molecular level". Volkswagen and Daimler are looking to analyze EV batteries through simulation to improve performance. The pharmaceutical industry is looking at understanding compounds better to create new drugs. The second type of application is Optimization. Airbus is looking at fuel-efficient ascent and descent paths, and Volkswagen is offering a service to reroute buses and taxis to minimize city traffic congestion.
Harvard Business Review adds three more possibilities to this list. The first is accelerated AI and Machine Learning (for example, Netflix's recommendation engine). The second is unstructured search, useful (for example) in microbiology for insights into genetic disorders or epidemic management.
The third is Factoring replacement. This is interesting because most of today's encryption (and the passwords and data it protects) relies on the difficulty of Factoring large numbers, and a sufficiently powerful QC could break it in short order. This has spurred a whole new field of study called "post-quantum cryptography". And by the way, if this does come into being, all the data in this world will need to be re-encrypted, and the new infrastructure (if required) to support it will need to be set up!
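To see why factoring matters, here is a toy sketch using a deliberately tiny, hypothetical key (real RSA keys are 2048 bits or more). Whoever can factor the public modulus can rebuild the private key and read the message; Shor's algorithm running on a large, error-corrected QC is what would make this feasible at real key sizes:

```python
# Toy RSA break by factoring. Brute-force trial division is hopeless at real
# key sizes, but it shows the principle on a tiny modulus.

def factor(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1

n, e = 3233, 17             # toy public key: n = 61 * 53
p, q = factor(n)            # the hard step for real keys; trivial here
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)         # rebuild the private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, n)
print(pow(ciphertext, d, n))   # 42 -- recovered without ever being given the private key
```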
McKinsey has opined that two sidebars of QC may see business use very soon: Quantum Sensing (QS) and Quantum Communication (QComm). QS provides more precise measurements and has multiple obvious uses, from bioimaging to navigation to environmental monitoring to geographical surveys. QComm will help with better encryption and with signal amplification.
So, there is hype, excitement, and confusion, all at the same time. But the research money going into QC is very large, and governments are joining the race. In addition to the US and China competing for supremacy, two other examples are:
- The World Economic Forum has set up a "Global Future Council on Quantum Computing"
- The Indian government has committed INR 8,000 crore to QC research and set up the "National Mission on Quantum Technologies and Applications" (NM-QTA)
The best way forward for today's IT leaders is to keep an eye on progress (a breakthrough can cut the timeline short, just as it did for AI), and to start thinking about which of their business applications could use QC (prepare for the breakthrough).
The author has managed large IT organizations for global players like MasterCard and Reliance, as well as lean IT organizations for startups, with experience in financial and retail technologies.