In this article, we'll explore a potential upgrade to one of the most common technologies in use today: the classical computer. Its replacement? The quantum computer.
MIT and Google, in partnership, have introduced a method that lends strong support to the claim that quantum computing can outperform traditional computing. The main aim of the research partnership was to verify that the complex computations performed by quantum chips are accurate, and to compare them against what classical computers can achieve.
Developing a Protocol to Verify Quantum Chip Efficiency
The computations in the research study were carried out on quantum chips using quantum bits (known as "qubits"). Unlike classical binary bits, which are always either 0 or 1, a qubit can exist in a "quantum superposition" of both states at once. This property is what makes quantum chips capable, in principle, of solving problems considered impossible for traditional computer systems.
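To make the idea concrete, here is a minimal sketch (illustrative only, not the researchers' code) of the difference between a classical bit and a qubit: a qubit's state is a pair of complex amplitudes over 0 and 1, and measurement probabilities are the squared magnitudes of those amplitudes.

```python
import math

classical_bit = 0  # a classical bit is always exactly 0 or 1

# A qubit in equal superposition: (|0> + |1>) / sqrt(2)
amp0 = 1 / math.sqrt(2)
amp1 = 1 / math.sqrt(2)
qubit = (amp0, amp1)

p0 = abs(qubit[0]) ** 2  # probability of measuring 0
p1 = abs(qubit[1]) ** 2  # probability of measuring 1
print(p0, p1)  # each probability is 0.5 -- the qubit holds "both" until measured
```

The probabilities always sum to 1; the superposition collapses to a single classical value only when the qubit is measured.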
The MIT and Google researchers also demonstrated a protocol that checks whether a Noisy Intermediate-Scale Quantum (NISQ) chip, also developed by their team, has performed all the required quantum operations. They validated the protocol on a notoriously difficult quantum problem running on a custom quantum photonic chip. Jacques Carolan, from the Department of Electrical Engineering and Computer Science (EECS) and the Research Laboratory of Electronics (RLE), shared: "Our technique provides an important tool for verifying a broad class of quantum systems. Because if I invest billions of dollars to build a quantum chip, it sure better do something interesting."
A custom system developed by MIT and Google researchers to validate methods for verifying quantum chips. Image Credit: Mihika Prabhu via MIT.
The method developed by the researchers used the following two techniques:
Divide and Conquer
The first of the researchers' two techniques is a "divide and conquer" approach, which lies at the core of their new protocol, known as "variational quantum unsampling".
The "Divide and Conquer" technique involves breaking the output quantum state into chunks and unscrambling it layer by layer, instead of processing the entire state in one shot. The inspiration for this technique came from neural networks, which solve problems through multiple layers of computation. Accordingly, the research team built its protocol around what it calls a "quantum neural network" (QNN).
Each layer of the QNN represents a set of quantum operations. Unscrambling one layer at a time takes far less effort than processing the whole computation at once; breaking the problem into smaller sub-problems ultimately helped the researchers solve complex problems more efficiently.
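The advantage of peeling off one layer at a time can be shown with a toy example (a hypothetical illustration, not the actual variational quantum unsampling protocol). Here a "circuit" scrambles a state through several layers of permutations, and we undo it layer by layer: each step searches only one layer's worth of candidate operations, rather than searching over every possible composition of all layers at once.

```python
import itertools
import random

N = 4
PERMS = list(itertools.permutations(range(N)))  # all possible layer operations

def apply_perm(state, p):
    """Route entry p[i] of the state into slot i (a toy 'quantum operation')."""
    return [state[p[i]] for i in range(N)]

random.seed(1)
layers = [random.choice(PERMS) for _ in range(3)]  # the unknown 3-layer circuit

# Forward pass: scramble a known input state through every layer.
state = [1, 0, 0, 0]
scrambled = state
for p in layers:
    scrambled = apply_perm(scrambled, p)

# Layer-by-layer unscrambling: for each layer (last first), search for the
# single operation that inverts it, apply it, then move to the next layer.
recovered = scrambled
searched = 0
for p in reversed(layers):
    for q in PERMS:
        searched += 1
        if all(apply_perm(apply_perm(e, p), q) == e
               for e in ([1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0])):
            recovered = apply_perm(recovered, q)
            break

print(recovered == state)          # True: the known input state is recovered
print(searched <= 3 * len(PERMS))  # True: per-layer search, not len(PERMS)**3
```

The per-layer search examines at most 3 * 24 = 72 candidates, versus 24**3 = 13,824 for a brute-force search over whole-circuit compositions; the gap widens rapidly with more layers.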
Under this approach, traditional silicon fabrication techniques were used to develop a 2-by-5-millimeter NISQ chip with more than 170 control parameters: adjustable circuit components used to manipulate the photons' paths. An external component injected pairs of photons into the chip at specific wavelengths. As the photons change paths, they interfere with one another by means of the chip's phase shifters. Through this processing, a random quantum output state is produced, which represents what would happen during a computation. An externally connected array of photodetector sensors measured the output.
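The role of a phase shifter can be sketched with a simplified two-mode interferometer (an idealized model, not the chip's real control code): a single photon passes through a beam splitter, a tunable phase shifter, and a second beam splitter, and the chosen phase steers the photon between the two outputs via interference.

```python
import cmath
import math

def beam_splitter(a0, a1):
    """Ideal 50/50 beam splitter acting on two mode amplitudes."""
    s = 1 / math.sqrt(2)
    return s * (a0 + 1j * a1), s * (1j * a0 + a1)

def interferometer(phase):
    a0, a1 = 1 + 0j, 0 + 0j            # photon enters in mode 0
    a0, a1 = beam_splitter(a0, a1)     # split the amplitude across both modes
    a0 = a0 * cmath.exp(1j * phase)    # programmable phase shifter on mode 0
    a0, a1 = beam_splitter(a0, a1)     # recombine: amplitudes interfere
    return abs(a0) ** 2, abs(a1) ** 2  # detection probabilities per mode

print(interferometer(0.0))      # photon exits mode 1 with certainty
print(interferometer(math.pi))  # photon exits mode 0 with certainty
```

Dialing the phase between 0 and pi smoothly redistributes the detection probability between the two outputs, which is how the chip's 170-plus control parameters shape the output state.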
Output Processing: Using the Divide and Conquer Approach
The measured output was then sent to the QNN, where complex optimization techniques were used to disentangle the noisy output, picking out a single photon's signature from among the scrambled ones. The unscrambling step then identifies the circuit operations that would return that photon to its known input state. In this way, operations were matched layer by layer across the computation.
The second technique used by the researchers was "boson sampling", an approach generally applied to photonic chips. It uses optical components and phase shifters to manipulate a set of input photons into a different quantum superposition of output photons. The main objective of this task is to calculate the probability that a certain input state maps to a certain output state.
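This probability calculation can be sketched using the standard boson-sampling formula (textbook math, not the researchers' code): for photons entering a given set of input modes of an interferometer described by a unitary matrix U, the probability of detecting them in a given set of output modes is the squared magnitude of the permanent of the corresponding submatrix of U. Unlike the determinant, the permanent has no known efficient classical algorithm, which is the root of the classical hardness of these samples.

```python
import itertools

def permanent(m):
    """Naive matrix permanent: sum over all column permutations (O(n!))."""
    n = len(m)
    total = 0
    for p in itertools.permutations(range(n)):
        term = 1
        for i in range(n):
            term *= m[i][p[i]]
        total += term
    return total

def output_probability(U, ins, outs):
    """Probability that photons in modes `ins` are detected in modes `outs`
    (collision-free case: at most one photon per mode)."""
    sub = [[U[i][j] for j in outs] for i in ins]
    return abs(permanent(sub)) ** 2

# Unitary of an ideal 50/50 beam splitter on two modes.
s = 2 ** -0.5
U = [[s, 1j * s], [1j * s, s]]

# Hong-Ou-Mandel effect: two photons entering modes 0 and 1 never exit in
# separate modes -- the two contributing amplitudes cancel exactly.
print(output_probability(U, (0, 1), (0, 1)))  # effectively 0: photons bunch
```

Even this brute-force permanent is factorial-time; scaling the photon count up is precisely what pushes the sampling task out of classical reach.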
Google's 'Sycamore' quantum processor, with which the company claimed 'quantum supremacy'. Image Credit: Erik Lucero.
Existing Problems with Classical Computation
Full-scale quantum computers would need millions of qubits, and so far that isn't feasible. Keeping this problem in mind, researchers started developing "Noisy Intermediate-Scale Quantum" (NISQ) chips, which comprise roughly 50 to 100 qubits.
Though this isn't an entirely efficient strategy, it's enough to demonstrate "quantum advantage", meaning NISQ chips can solve problems that traditional computers are incapable of solving. Both approaches used by the researchers involve computing samples that are infeasible for a classical computer to reproduce, mainly because the photons behave unpredictably. The NISQ chips' results therefore suggest that their computation is quick and efficient.
Applying the Quantum Verification Method to "Physical World" Problems
The researchers hope the work will contribute to the ongoing effort to make the quantum verification process as quick and efficient as possible. The team also hopes the findings will apply to other problems in the physical world.
The new methods demonstrated by MIT and Google were dedicated to verifying that quantum chips deliver efficient performance and accurate computation. The team succeeded in demonstrating that a quantum system can perform complex operations that traditional computers cannot.