Quantum Leap: The Quantum Computing Revolution
Quantum computing describes the massive change underway in both computer hardware and digital computation. At the heart of this revolution is a fundamental difference in the way the hardware represents and processes information. While still in its infancy, quantum computing is poised to fundamentally change the way we interact with the digital world.
Currently, computers represent information as bits in a binary fashion: each bit is either a one or a zero. A collection of bits, called a bit string, can be used to represent things like letters of the alphabet; under the American Standard Code for Information Interchange (ASCII), for example, 01000001 is the code for the letter A. Using this binary system of bits, computations are done either serially (one after the other) or in parallel (a calculation is broken into multiple subtasks that are then worked through at the same time).
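As a quick, purely illustrative check (a few lines of Python, not tied to any particular system), the bit string 01000001 does indeed decode to the letter A:

```python
# Interpret the ASCII bit string as a base-2 integer, then look up the character.
bits = "01000001"
code_point = int(bits, 2)        # 65
print(chr(code_point))           # A

# And the reverse: format the character's code as an 8-bit binary string.
print(format(ord("A"), "08b"))   # 01000001
```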
Quantum computing, on the other hand, deals with qubits, or quantum bits, instead of the traditional bit. Because qubits can be placed in a superposition of states, calculations involving them are worked through simultaneously, without breaking the computation into subtasks to be run in parallel. To see the difference: in the current bit-based model, three bits together can take on only one of eight possible combinations at a time: 000, 001, 010, 011, 100, 101, 110, and 111. In the qubit-based model, three qubits can hold all eight of those combinations at once, meaning that a three-qubit state can encompass every possible combination of 1s and 0s simultaneously.
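One rough way to picture this is to simulate the three-qubit state vector directly on an ordinary computer. The sketch below uses plain NumPy (no quantum hardware or quantum software library is assumed) to put three qubits, each starting in the |0⟩ state, into an equal superposition of all eight combinations:

```python
import numpy as np

# A single qubit in the |0> state, and the Hadamard gate that creates superposition.
ket0 = np.array([1.0, 0.0])
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

# Three qubits start in |000>: the tensor (Kronecker) product of three |0> states.
state = np.kron(np.kron(ket0, ket0), ket0)   # length-8 state vector

# Apply a Hadamard gate to each qubit (H ⊗ H ⊗ H acting on the whole state).
state = np.kron(np.kron(H, H), H) @ state

# All eight basis states 000..111 now carry the same amplitude, 1/sqrt(8) ≈ 0.354.
for index, amplitude in enumerate(state):
    print(f"{index:03b}: {amplitude:.3f}")
```

Measuring the register would still return only one of the eight combinations, but before measurement the single state vector carries all of them at once, which is what the comparison above is pointing at.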
In cases involving probabilistic computing (i.e., determining which combinations, out of all possible combinations, are the “correct” ones), qubits allow for simultaneous computation without the need to break the problem into smaller subtasks, compute each outcome, and then compare the results to identify the “correct” combination. This represents an enormous savings in computational time in many circumstances once the number of bits or qubits in the calculation begins to increase: the greater the number of bits or qubits, the larger the potential time savings. The number of possible combinations of n bits grows according to the formula 2^n: 2 bits = 4 combinations, 3 bits = 8 combinations, 4 bits = 16 combinations, 5 bits = 32 combinations, and so on. While these combinations can be worked through in parallel using bits, there is no need to generate subtasks at all when using quantum computing.
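The exponential growth itself is easy to tabulate; the short loop below simply prints 2^n for a few values of n (plain Python, purely illustrative):

```python
# The number of distinct combinations representable by n bits is 2**n.
for n in range(2, 11):
    print(f"{n:2d} bits -> {2 ** n:5d} combinations")
```

By 10 bits there are already 1,024 combinations, and each additional bit doubles the count.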
This means that for practical tasks, like encrypting and decrypting information, many possible passwords or character combinations could be checked in a fraction of the time it takes current computers to work through them. Current encryption methods may no longer prove secure for long-term storage (think iCloud, Google Drive, Dropbox, etc.). Transaction-based encryption, like that used for online banking, would still be relatively secure, since the encryption only needs to hold for as long as it takes a transaction to be processed and the results recorded and posted. However, information security, both as a practice and as an industry, will require a paradigm shift in order to maintain the level of privacy users currently expect.
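For a rough sense of scale, the sketch below compares the worst-case number of guesses a classical brute-force key search needs (about 2^n for an n-bit key) with the roughly 2^(n/2) evaluations a Grover-style quantum search is expected to need. The numbers are purely illustrative and ignore hardware speed, error correction, and the details of any particular encryption scheme:

```python
# Back-of-the-envelope brute-force comparison (illustrative only).
# Classical search tries up to 2**n keys; a Grover-style quantum search
# needs on the order of 2**(n / 2) evaluations.
for key_bits in (56, 128, 256):
    classical_guesses = 2 ** key_bits
    quantum_evaluations = 2 ** (key_bits // 2)
    print(f"{key_bits}-bit key: ~{classical_guesses:.1e} classical guesses, "
          f"~{quantum_evaluations:.1e} quantum evaluations")
```

These figures cover only brute-force guessing; attacks on certain widely used schemes, such as those based on factoring, can be sped up far more dramatically, which is part of why long-term storage is the bigger worry.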