By: John Levy, Co-Chief Executive Officer at SEEQC

ENIAC, the famed Electronic Numerical Integrator and Computer, was, in terms of computational power, one of the greatest breakthroughs of its time. The 50-by-30-foot machine was designed and funded by the United States to aid the war effort, where the slightest technological advancement could make or break a mission.

Although complex, expensive and labor-intensive to operate, ENIAC was the closest thing to a general-purpose electronic digital computer the world had yet seen. While the machine was built for specific military computing tasks, its logic system could be reconfigured to solve a variety of other problems.

Promise vs practice

In terms of capability, significance and application, today’s quantum computers are roughly at the ENIAC stage of their development. Companies like Google and IBM have built machines with the power to demonstrate how the world will one day solve previously intractable problems in data-intensive industries like chemistry, financial modeling and logistics.

But in their current state, these machines are still a long way from real-world adoption. Much like early classical computers, today’s machines are prohibitively expensive and impractical for commercial use. The space and cost required to house even a 50-qubit machine put it out of reach for all but the largest enterprises, and the computational power of a cloud-based system, while useful for experimentation and exploration, falls far short of what enterprise-grade computing requires.

The “steampunk chandeliers,” as some have taken to calling them, require hundreds of expensive coaxial cables and specialized, costly analog equipment, and they are susceptible to interference of various kinds. On top of that, quantum computers are highly sensitive to their environment, and steps must be taken to control both external and internal noise.

While the sheer computational potential of these machines is not in question (see Google’s quantum supremacy achievement), whether the industry’s current ideas about quantum computing architecture can deliver affordable, stable and scalable machines very much is. With hundreds of millions of dollars of investment at stake, some are beginning to question whether short-term expectations for the technology have been set too high.

The temperamental nature of quantum mechanics

In their current iteration, quantum computing systems do not provide the efficiency, stability or control features necessary to meet ROI goals for businesses. Google’s current machine operates with about 50 qubits, and there is talk of a new, much more ambitious goal: a machine with a million or more qubits. But bigger and faster machines won’t necessarily translate into better business results; they’ll just be bigger and more expensive. And bigger doesn’t mean scalable, especially in an environment in which qubits have such a high rate of error.
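To see why error rates, rather than raw qubit counts, dominate the scaling picture, consider a back-of-envelope model (an illustration of the general point, not SEEQC’s own analysis): if each gate fails independently with probability p, a circuit of n gates runs without error with probability roughly (1 - p)^n. A minimal Python sketch makes the collapse concrete:

    # Illustrative only: assumes independent gate errors, which is a
    # simplification of real device behavior.
    def success_probability(error_rate: float, num_gates: int) -> float:
        """Probability that every gate in a circuit executes without error."""
        return (1.0 - error_rate) ** num_gates

    for error_rate in (1e-2, 1e-3):  # roughly today's two-qubit gate error rates
        for num_gates in (100, 1_000, 10_000):
            p = success_probability(error_rate, num_gates)
            print(f"error rate {error_rate:.0e}, {num_gates:>6} gates: success ~ {p:.2%}")

At a 1% gate error rate, a 100-gate circuit succeeds only about a third of the time, and a 1,000-gate circuit essentially never does; adding qubits without lowering error rates only makes the problem worse.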

In addition to cost and control challenges, the industry has yet to solve the energy problem of managing the heat generated by the microwave pulses used to control hundreds of qubits, let alone millions. Qubits need to be kept at temperatures near absolute zero to function, a serious challenge for any company hoping to scale its machines.

Checking hype, delivering machines that work

Fortunately, the computer industry improved on the groundwork ENIAC laid more than 70 years ago. Powerful general-purpose computers changed every industry and economy on the planet. Once the industry finds a way to better manage and scale them, quantum computers will too. Until then, the question remains: how quickly can the industry innovate and deliver a commercially scalable quantum computer?

This is a question SEEQC is tackling, and we will be sharing more details with you soon.