Turning quantum theory into reality
For decades, quantum physics was stuck in the realm of ideas and theories.
But today, our scientists are using those ideas and theories to forge the future of sensors, computers, data security and imaging technology. Here are some of the ways our experts are harnessing the power of quantum technology.
- Superconducting qubits
- Quantum process verification
- High-speed, low-power memory
- Quantum key distribution
- Low-probability-of-detect communications
- Quantum imaging testbed
- Nanophotonics
Superconducting qubits
Qubits, or quantum bits, can solve problems that are too big or too complicated for today's computer processors. Unlike classical bits, which answer one yes-or-no question at a time, qubits can be placed in superpositions and entangled with one another, letting a quantum processor work through many candidate answers at once. That method of processing is especially promising in areas such as data modeling and advanced encryption.
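A quick way to see why qubits scale so differently from classical bits is to count the numbers needed to describe them. This pure-Python sketch (an illustration, not any production simulator) builds the state of an n-qubit register: 2**n complex amplitudes, so every added qubit doubles what a classical machine must track.

```python
import math

def uniform_superposition(n):
    """State after a Hadamard on each of n qubits starting from |00...0>:
    every one of the 2**n basis states carries equal amplitude."""
    dim = 2 ** n
    amp = 1.0 / math.sqrt(dim)
    return [amp] * dim

state = uniform_superposition(3)
assert len(state) == 8                                 # 2**3 amplitudes
assert abs(sum(a * a for a in state) - 1.0) < 1e-12    # probabilities sum to 1
```

Three qubits already require eight amplitudes; fifty would require about 10**15, which is why even modest quantum processors are hard to emulate classically.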
Advances in quantum computing now allow us to engineer systems with multiple qubits. The scientists in our state-of-the-art cryogenics laboratory are studying the implementation of quantum computing with superconducting-circuit-based qubits. We have developed custom hardware called an arbitrary pulse sequencer to control large numbers of qubits and program long sequences of quantum gates. We have also developed a low-latency feedback system with a field-programmable gate array – a key capability for implementing error correction.
Quantum process verification
One problem in quantum computing is that every qubit added to a computer multiplies the time it takes to map out the machine's activity, so the cost grows exponentially with the number of qubits. The existing technique, known as quantum process tomography, will become impractical with the next generation of quantum devices — those with 10 qubits or more. We are working on a way to characterize quantum devices much faster, in what's known as polynomial time. This, in turn, will make the characterization of larger-scale quantum devices practical.
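A back-of-envelope count (an illustration of the standard scaling argument, not a figure from this research) shows where the exponential wall comes from: full process tomography of an n-qubit device must reconstruct a matrix with roughly 16**n entries.

```python
def qpt_parameters(n):
    """Entries in the d**2 x d**2 process matrix for an n-qubit device."""
    d = 2 ** n          # Hilbert-space dimension
    return d ** 4       # grows as 16**n

for n in (2, 5, 10):
    print(n, qpt_parameters(n))
```

At n = 10 the reconstruction already involves on the order of 10**12 parameters, which is why the article describes the method as impractical beyond that scale and why a polynomial-time alternative matters.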
Another problem is that the measurements produced by quantum process tomography are inherently inaccurate: the method cannot cleanly separate errors in the process being measured from errors in state preparation and measurement itself. We have developed a method called randomized-benchmarking tomography that greatly reduces this problem.
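The core idea behind randomized benchmarking can be sketched with a toy model (illustrative numbers and a simplified estimator, not the lab's implementation): average fidelity over random gate sequences of length m decays as F(m) = A * p**m + B, where the decay rate p reflects gate error alone while preparation-and-measurement error is absorbed into the constants A and B.

```python
def fidelity(m, p=0.99, A=0.45, B=0.5):
    """Toy fidelity-decay model: A and B hide the measurement errors."""
    return A * p ** m + B

def estimate_decay(F, m, k):
    """Recover p from three equally spaced sequence lengths;
    the constants A and B cancel out of the ratio."""
    r = (F(m) - F(m + k)) / (F(m + k) - F(m + 2 * k))
    return r ** (-1.0 / k)

p_hat = estimate_decay(fidelity, 0, 10)
assert abs(p_hat - 0.99) < 1e-9   # gate error recovered despite A and B
```

Because p is extracted from the decay alone, the estimate is insensitive to the measurement errors that bias ordinary process tomography.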
Once we have full-scale quantum computers, we’ll need to know how to tell them what to do. Our work in quantum algorithms spans three key areas:
- Quantum walk algorithms, which are useful for problems that require searching large data sets;
- Algebraic algorithms, which can crack encryption systems;
- Adiabatic algorithms, which solve complicated and dynamic problems, such as programming machines to consume only the smallest amount of power necessary.
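The first of these can be sketched in a few lines. This is a generic Hadamard-coin quantum walk on a ring of sites (a standard textbook construction, not the lab's own algorithm), tracking the full amplitude vector in pure Python:

```python
import math

def quantum_walk(n_sites, steps):
    """Discrete-time quantum walk on a ring: coin flip, then conditional shift."""
    h = 1 / math.sqrt(2)
    # amps[x] = [amplitude with coin "left", amplitude with coin "right"]
    amps = [[0j, 0j] for _ in range(n_sites)]
    amps[0][0] = h          # symmetric starting coin state at site 0
    amps[0][1] = h * 1j
    for _ in range(steps):
        # coin step: Hadamard on the coin register at every site
        coined = [[h * (a + b), h * (a - b)] for a, b in amps]
        # shift step: coin "left" moves one site left, "right" one site right
        new = [[0j, 0j] for _ in range(n_sites)]
        for x, (a, b) in enumerate(coined):
            new[(x - 1) % n_sites][0] += a
            new[(x + 1) % n_sites][1] += b
        amps = new
    return [abs(a) ** 2 + abs(b) ** 2 for a, b in amps]

probs = quantum_walk(64, 20)
assert abs(sum(probs) - 1.0) < 1e-9   # unitary evolution conserves probability
```

Unlike a classical random walk, whose position spreads as the square root of the number of steps, the quantum walk's interference lets it spread linearly — the property that makes walk-based search algorithms fast on large data sets.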
Energy-efficient classical computing
High-speed, low-power memory
Very powerful computers don’t have to consume massive amounts of energy. We are exploring low-power technologies including the integration of superconducting circuits with spintronics, an alternative to traditional electronics that uses an electron’s spin instead of its charge. We’re also working on spin transfer torque random access memory, which can operate at superconducting circuit temperatures while providing fast, high-density, low-power random access memory located near cryogenic processors.
Quantum key distribution
Most of today's data encryption systems rely on one assumption: that the adversary lacks a computer powerful enough to factor large numbers. As computers grow more powerful, that protection weakens. A technology called quantum key distribution can solve this problem. With this technique, two distant parties connected by an optical channel can share a random key whose secrecy is guaranteed by quantum physics rather than by the difficulty of a math problem. Those keys can later be used to encrypt and decrypt communications.
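The flavor of the technique is easy to convey with the sifting step of the well-known BB84 protocol (a minimal classroom sketch, not the deployed network's protocol stack): Alice encodes random bits in randomly chosen bases, Bob measures in randomly chosen bases, and the two keep only the positions where their bases happened to match.

```python
import random

def bb84_sift(n, seed=0):
    """Noiseless, eavesdropper-free BB84 sifting over n transmitted photons."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n)]
    alice_bases = [rng.randint(0, 1) for _ in range(n)]
    bob_bases   = [rng.randint(0, 1) for _ in range(n)]
    # matching bases reproduce Alice's bit; mismatched bases give a coin flip
    bob_bits = [bit if ab == bb else rng.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]
    return [(a, b)
            for a, ab, bb, b in zip(alice_bits, alice_bases, bob_bases, bob_bits)
            if ab == bb]

key = bb84_sift(1000)
assert all(a == b for a, b in key)   # sifted bits agree on both ends
```

In the real protocol, an eavesdropper measuring in the wrong basis disturbs the photons and shows up as errors in the sifted key — which is what lets the parties detect interception before using the key.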
We completed the first metropolitan-scale quantum key distribution network in 2005, linking Harvard University, Boston University and BBN Technologies. Our recent research in quantum key distribution includes the discovery of what’s known as the Takeoka-Guha-Wilde bound, which governs the fundamental tradeoff between rate and distance for quantum key distribution.
Now we're working on a technology called a quantum repeater that could help surpass that bound.
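The rate-distance tradeoff is concrete enough to compute. This sketch (an illustration using the published bound and a typical fiber-loss figure of 0.2 dB/km, not data from this program) evaluates the Takeoka-Guha-Wilde limit log2((1 + eta) / (1 - eta)) on secret-key bits per channel use, where eta is the fraction of photons that survive the fiber:

```python
import math

def tgw_bound(distance_km, loss_db_per_km=0.2):
    """Upper bound on secret-key bits per channel use over lossy fiber."""
    eta = 10 ** (-loss_db_per_km * distance_km / 10)   # channel transmissivity
    return math.log2((1 + eta) / (1 - eta))

for d in (50, 100, 200):
    print(d, tgw_bound(d))
```

Because eta falls exponentially with distance, so does the achievable key rate — the barrier a quantum repeater is meant to break by distributing entanglement in shorter hops.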
Low-probability-of-detect communications
Enemies can't eavesdrop on a conversation they don't know is happening. That's the idea behind low-probability-of-detect communications. Our research in this area shows it is possible to encode a message onto photons, or particles of light, and then hide it inside the noise that naturally occurs in the communication channel.
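A rough intuition for why hiding in noise works (a back-of-envelope model, not this program's result): an adversary averaging n noisy power samples can only resolve mean shifts down to about sigma / sqrt(n), so a signal whose per-sample power stays well below that threshold is statistically indistinguishable from the channel's own fluctuations.

```python
import math

def smallest_detectable_shift(sigma, n):
    """Approximate mean-power shift resolvable from n samples of noise
    with standard deviation sigma (the standard error of the mean)."""
    return sigma / math.sqrt(n)

# More samples let the adversary resolve finer shifts, so the covert signal
# must shrink as 1/sqrt(n); total covert throughput then grows only like
# sqrt(n) -- the "square-root law" of covert communication.
assert abs(smallest_detectable_shift(1.0, 10_000) - 0.01) < 1e-12
```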
Quantum imaging testbed
Quantum imaging testbed
Laser radar is a powerful tool that lets public safety agencies, militaries and other organizations create quick and accurate maps of an area. Our experimental quantum imaging testbed helps us extract the most information from these systems with the smallest amount of power necessary. The testbed can measure baseline photon information efficiency, or the amount of information we can write onto photons, in coherent and non-classical states. It can encode arbitrary amplitude masks on light. Its transverse imaging setup uses a spatial light modulator and an Andor Luca electron-multiplying charge-coupled device camera.
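To make "photon information efficiency" concrete, here is a standard textbook example (an assumed illustration, not testbed data): with M-ary pulse-position modulation, a single photon placed in one of M time slots carries up to log2(M) bits, so efficiency grows with the modulation order.

```python
import math

def ppm_bits_per_photon(m_slots):
    """Ideal photon information efficiency of M-ary pulse-position modulation."""
    return math.log2(m_slots)

for m in (2, 16, 1024):
    print(m, ppm_bits_per_photon(m))   # 1, 4 and 10 bits per photon
```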
Nanophotonics
This area promises radical changes in our ability to sense, compute and process information. In nanophotonics, we take devices such as lenses and prisms and shrink them to the micro- and nanoscale, where they are far more effective at manipulating light. This, in turn, will help us use photons, or particles of light, to transmit large volumes of data very far and very fast, over ultra-large bandwidths at optical frequencies. This technology will one day lead to stronger sensors, higher-resolution imaging, LADAR, secure communications and ultrawide-band signal processing.
This document does not contain Technical Data or Technology controlled under either the U.S. International Traffic in Arms Regulations or the U.S. Export Administration Regulations. E17-VJ52