From PLANT West: D-Wave’s Quantum Leap

D-Wave puts the super into complex computing.

October 16, 2013   by Matt Powell, Assistant Editor

An advanced technology company in BC has successfully executed a quantum leap in the computing universe that has the potential to solve problems far beyond the capabilities of digital systems.

D-Wave Systems Inc. makes “quantum” computers: 10-square-metre super instruments that perform mind-boggling (to non-physicists) computations in a fraction of the time a conventional system would take.

The Burnaby, BC-based company’s list of clients includes Lockheed Martin, Google and NASA, and results show its system does the math at incredible speed. Lockheed, a global aerospace, defence, IT and advanced technology manufacturer, is using the system to speed up complex computation tests.

D-Wave’s innovative computer makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. It differs from a digital computer based on transistors, which requires data to be encoded in binary digits (1s and 0s). The quantum computer employs qubits, which exist in two states, on and off, simultaneously, to speed up calculations. It’s actually an adiabatic computer that reads out the ground state of its qubits to find a solution. The model is well suited to optimization problems, where any number of criteria are examined simultaneously.
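To give a classical flavour of what “reading out the ground state” means: an optimization problem is written as an energy function over binary variables, and the answer is the assignment with the lowest energy. The sketch below is purely illustrative, with a made-up three-variable problem and a brute-force search; it is not D-Wave’s API, which performs this search natively in hardware.

```python
from itertools import product

def energy(bits, h, J):
    """Ising-style energy: local fields h plus pairwise couplings J."""
    e = sum(h[i] * s for i, s in enumerate(bits))
    e += sum(Jij * bits[i] * bits[j] for (i, j), Jij in J.items())
    return e

# Hypothetical 3-variable problem (spins are +1/-1).
h = [1.0, -0.5, 0.25]
J = {(0, 1): -1.0, (1, 2): 0.5}

# Exhaustively enumerate all 2**3 spin configurations; the lowest-energy
# one is the "ground state" an annealer is built to find directly.
ground = min(product([-1, 1], repeat=3), key=lambda s: energy(s, h, J))
print(ground, energy(ground, h, J))
```

Brute force doubles in cost with every added variable, which is exactly why hardware that finds the ground state directly is attractive.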


To put things into perspective, such a supercomputer could find all the factors of a 2,000-bit number in about 16 hours, more than 5,000 times faster than a conventional computer, which would take about 10 years.

The systems can also be taught to recognize specific objects in images.

The D-Wave innovation has its critics who have been dissatisfied with the company’s experimental proof of quantum entanglement inside the devices, but deals with giants such as Google, NASA and Lockheed Martin seem to have cooled some of the heated skepticism.

Scott Aaronson, a professor at MIT, originally said D-Wave’s demonstrations didn’t prove anything about the workings of the computer, and that a quantum computer would require a major breakthrough in physics. After a 2012 visit to D-Wave’s headquarters, he retired his self-described title of “Chief D-Wave Skeptic.”

“We’ve invited those skeptics to our facility to kick the tires and talk to our scientists, and a lot of them leave excited about the prospects of our technology,” says Jeremy Hilton, D-Wave’s vice-president of processor development. “There’s still a lot of unknowns in what we do, and that’s part of the challenge, but we’ve seen enough success that our investors are still excited about what we’re doing and we’ve kept the doors open.”

D-Wave was founded in 1999 by Haig Farris, Bob Weins, Alexandre Zagoskin and Geordie Rose, the company’s current chief technology officer. It originally operated as an offshoot of the University of British Columbia, maintaining ties with the Department of Physics and Astronomy and funding academic research in quantum computing. With only a handful of employees, all PhDs and theoretical physicists, the company set out to explore the idea that quantum computers could solve the unsolvable.

Breakthrough technology

Until 2004, D-Wave focused on uncovering the best applications for the technology, says Hilton. “That’s when we had a breakthrough.”

The team committed to an application-specific processor technology that would solve one particular problem. The D-Wave One system came equipped with a 128-qubit processor that performed discrete optimization operations using quantum annealing.

The first processors were manufactured at NASA’s Jet Propulsion Lab where semiconductor fabrication capabilities are advanced.

By 2006, D-Wave had partnered with a Silicon Valley semiconductor producer with enough capability to handle the complex parts. Manufacturing is still performed in California, but the testing and qualification of the processors keeps 70 employees busy at D-Wave’s Burnaby headquarters.

“The chips in conventional computers are based on transistors, but ours are like tunnel junctions. By the time we got to the 128-qubit processor, it housed 24,000 tunnel junctions – that’s a huge jump in the manufacturing and production of these things,” says Hilton. “These processors are mixed analogue and digital circuits which present exceptional production challenges.”

Quantum annealing is an optimization method that finds the minimum of a function over a set of discrete candidate solutions by exploiting quantum fluctuations, a phenomenon predicted by the Heisenberg uncertainty principle.
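Quantum annealing’s classical cousin, simulated annealing, conveys the flavour of the search: always accept moves that lower the energy, and accept uphill moves with a probability that shrinks as the system “cools.” The sketch below uses a made-up one-dimensional objective and is purely illustrative; it is not how a D-Wave machine is programmed.

```python
import math
import random

def simulated_anneal(cost, start, neighbor, steps=5000, t0=2.0):
    """Classical analogue of annealing: accept worse candidates with a
    probability that shrinks as the 'temperature' cools toward zero."""
    random.seed(0)                      # fixed seed, for a reproducible demo
    x, best = start, start
    for k in range(1, steps + 1):
        t = t0 / k                      # simple cooling schedule
        y = neighbor(x)
        delta = cost(y) - cost(x)
        if delta <= 0 or random.random() < math.exp(-delta / t):
            x = y                       # downhill always; uphill sometimes
        if cost(x) < cost(best):
            best = x
    return best

# Toy discrete objective: minimize (x - 3)^2 over integers in [-10, 10].
cost = lambda x: (x - 3) ** 2
neighbor = lambda x: max(-10, min(10, x + random.choice([-1, 1])))
print(simulated_anneal(cost, start=-10, neighbor=neighbor))
```

The early high-temperature phase lets the search escape local minima; quantum annealing replaces that thermal escape with quantum tunnelling through energy barriers.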

The computer is housed in a seven-by-11-foot shielded room with a refrigeration component (which keeps the processor chilled to near absolute zero, or -273.15 degrees C), and peripheral electronics to control the processor. Three 19-inch racks support the data server and fridge control infrastructure.

“Most people wanted to build a universal machine that solved every issue, but that made us a little nervous,” says Hilton. “Once we started researching optimization, we realized a lot of the problems we were looking at …  are difficult, even impossible to solve so people make approximations that render the solution meaningless.”

Such problems are exactly what Lockheed Martin is tackling with D-Wave’s quantum computers. It signed a multi-year contract with D-Wave to uncover the benefits of such advanced computing, using the D-Wave One 128-qubit system to address software verification and validation.

“When you design a system that has millions of lines of code, it takes an incredible amount of time to validate it,” says Brad Pietras, vice-president of corporate engineering and technology at Lockheed Martin.  “Testing can take years or decades on the best system, but if you can test them simultaneously, it cuts that time significantly.”

In April, Lockheed upgraded its D-Wave One system to a 512-qubit D-Wave Two, said to be 500,000 times faster than its predecessor. The computer, reported by a US media outlet to have been purchased for $10 million (although this number is not confirmed by Lockheed or D-Wave), will be installed at the aerospace giant’s new Quantum Computation Centre at the University of Southern California in Los Angeles.

The processor is made from a lattice of minute superconducting wires, then programmed with a set of mathematical equations. It speeds through a near-infinite number of possibilities to find the lowest-energy configuration, which corresponds to the optimal outcome.

“Building a new plane, which is almost completely driven by software processes and interacting with automated mechanical systems, [requires a tremendous] amount of code so having the ability to validate that code in a shorter amount of time not only speeds up testing, it also makes those systems safer and gets new products to market faster,” explains Hilton.

While Pietras won’t divulge specific applications, Ray Johnson, Lockheed’s chief technical officer, told The New York Times the company is exploring ways the technology could create and test complex radar, space and aerospace systems. An extreme example would be determining in hours instead of days how the millions of lines of software running a network of satellites would react to a sudden solar burst or a pulse from a nuclear explosion.

Potential for manufacturers

So far, Pietras says the computer has performed as it’s supposed to. “The challenge now is understanding how the computer works and the boundaries of its performance, but D-Wave’s done a good job with its 512-qubit processors at identifying that trajectory of performance.”

Hilton is confident Lockheed won’t be D-Wave’s only manufacturing customer, suggesting the system is able to solve more than super-complex back-end code issues.

Other functions for manufacturers include:

• Classification and anomaly detection, permitting a predominantly machine-based monitoring infrastructure to independently detect significant events and respond accordingly, whether the responses are user-defined or data-driven.

• Scheduling and logistics efficiency. Classic examples include courier delivery optimization (minimizing fuel costs and resource time), airline scheduling and responsiveness to change events, and inventory use and management.

• Software verification and validation. “This can involve anything from individual processes (such as operation of a temperature bath or specific-purpose tool/machine), or infrastructural software that coordinates and monitors overall factory performance,” says Hilton.

• Network analysis and optimization. “This type of static or adaptive optimization is used to manage network distribution or flow, and is a major challenge in areas such as power or water management,” says Hilton.
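The courier example above can be made concrete with a classical baseline. The sketch below is a hypothetical nearest-neighbour routing heuristic over made-up stop coordinates: fast but approximate, exactly the class of shortcut that an exact optimizer aims to beat.

```python
import math

def route_nearest_neighbor(depot, stops):
    """Greedy courier route: from the current location, always drive to
    the closest unvisited stop. Cheap to compute, but the total distance
    can be far from optimal."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    route, here, todo = [depot], depot, list(stops)
    while todo:
        nxt = min(todo, key=lambda s: dist(here, s))
        todo.remove(nxt)
        route.append(nxt)
        here = nxt
    return route

# Hypothetical delivery stops as (x, y) coordinates from a depot at (0, 0).
stops = [(4, 0), (1, 1), (0, 5), (2, 2)]
print(route_nearest_neighbor((0, 0), stops))
```

Finding the truly shortest route requires comparing factorially many orderings, which is why such scheduling problems are pitched as candidates for quantum optimization.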

D-Wave’s progress to date brings the company to the starting line. The computer’s applications will continue to evolve as more customers get their hands on it to unleash hidden value in their operations.

“If you look at technology’s feedback cycle, new technologies will always be absolutely crucial and innovation is always possible,” says Hilton, who is fond of an Isaac Newton quote: “If I have seen further, it is by standing on the shoulders of giants.”


This article appears in the September/October edition of PLANT West.
