Whitepaper

The Dawn of the Quantum Era

How Quantum Computing Will Reshape the World

This whitepaper explores why scaling—not just building—the first quantum computer will determine who leads the next era of technology. From accelerating drug discovery and clean energy to securing communications and transforming AI, SEEQC outlines the breakthroughs, risks, and policy actions that will define the future.

Executive Summary

Quantum computing is not just another innovation. It is a new frontier. One that has the potential to serve as a foundational platform for science and technology in the coming decades, accelerating progress across nearly every domain, from materials discovery and clean energy to advanced medicine and national security. While much of the attention has focused on building the first quantum computer, the far more consequential challenge lies in scaling these systems to practical, usable levels. Only then can the technology’s full potential be realized.

This paper begins by tracing the historical arc from classical computing’s microprocessor breakthrough to today’s quantum frontier. It will then provide a basic overview of what exactly quantum computing is, outlining the principles that make it so unique, such as superposition, entanglement, and interference. We will see how these capabilities enable dramatic advances in fields ranging from chemistry and cryptography to AI and energy.

Following this overview, the paper will delve into the critical issue of scalability, highlighting why integration of quantum processors, classical control electronics, and interface systems onto a single chip is essential to making quantum systems viable.

Finally, the paper reviews the current U.S. policy landscape and identifies gaps in manufacturing capacity, supply chain resilience, and commercialization support. It evaluates emerging legislative and agency efforts, including DARPA’s Quantum Benchmarking Initiative and DOE’s proposed infrastructure bill, and argues for a more coordinated national strategy.

The United States’ dominance in classical computing wasn’t an accident. It was the result of strategic thinking, significant investment, and a conducive innovation ecosystem. But the classical era is coming to an end and we are entering the dawn of the quantum era. The writing is on the wall: quantum computing will reshape the world in extraordinary ways. And whoever learns how to scale it first will be at the helm. The United States must act now to ensure that leadership in quantum research is matched by leadership in deployment.

Intro

Intro

Intro

We are about to begin a new chapter in the story of humanity, one in which technology has endowed us with powers once reserved for the divine. We can rewrite the genetic code of life using CRISPR, engineer synthetic organisms from scratch, and inch closer to replicating the energy of the stars through nuclear fusion. Organs can be 3D-printed, carbon extracted directly from the atmosphere, and matter manipulated at the atomic scale.

Amid this cascade of remarkable scientific breakthroughs transforming the world, one idea may prove the most consequential of all.

In the 1980s, physicists Richard Feynman and David Deutsch proposed an idea that could reshape our future in ways even science fiction has yet to imagine. Their idea was that the power of quantum mechanics could one day give rise to a new type of computing—one far more powerful and far-reaching than anything we have ever created.

To get a sense of how powerful quantum computers could become, it helps to start with the evolution of classical computing. The Electronic Numerical Integrator and Computer (ENIAC), completed in 1945 at the University of Pennsylvania, is widely considered the first modern digital computer. ENIAC could perform 5,000 calculations per second—impressive for its time. Over the next few decades, progress was steady but incremental. Then came 1971 and the invention of the microprocessor. By compressing ENIAC’s sprawling maze of vacuum tubes and wiring onto a single chip, the microprocessor enabled the exponential scaling predicted by Moore’s Law and sparked the modern computing era. Fast forward to April 2025, and El Capitan, housed at Lawrence Livermore National Laboratory, currently holds the title of the world’s most powerful supercomputer, capable of executing 1.742 quintillion calculations per second. That’s 348.4 trillion times faster than ENIAC.
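The speedup figure above is straightforward to verify. A quick sanity check, using only the two throughput numbers quoted in this paper:

```python
# Sanity check of the ENIAC vs. El Capitan comparison quoted above.
eniac_ops_per_sec = 5_000            # ENIAC, 1945: ~5,000 calculations per second
el_capitan_ops_per_sec = 1.742e18    # El Capitan, 2025: 1.742 quintillion per second

speedup = el_capitan_ops_per_sec / eniac_ops_per_sec
print(f"{speedup:.4g}")              # ~3.484e14, i.e. 348.4 trillion times faster
```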

Consider the difference between the original 1984 Macintosh and a new MacBook. Now imagine the leap from ENIAC to El Capitan. Even that staggering contrast doesn’t begin to capture the scale of the difference between quantum computing and today’s most advanced classical systems.

Figure 1: The exponential leap from ENIAC (1945) to El Capitan (2025) demonstrates how transformative breakthroughs can reshape entire technological paradigms.

ENIAC and El Capitan: What 80 Years of Innovation Looks Like

In a paper published in Nature on December 9, 2024, Google Quantum AI introduced Willow, a 105-qubit superconducting quantum processor. Willow successfully completed a random circuit sampling (RCS) benchmark—a test designed to measure a quantum computer’s ability to process highly complex, non-deterministic quantum circuits—in under five minutes. By contrast, El Capitan and other leading classical supercomputers would require an estimated 10 septillion (10²⁵) years to complete the same task. That’s roughly 725 trillion times the age of the universe.
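The age-of-universe comparison is simple arithmetic on the estimate above, assuming the universe is roughly 13.8 billion years old:

```python
# How long is 10 septillion years relative to the universe's age?
# Assumes a universe age of ~13.8 billion years.
rcs_classical_years = 1e25           # estimated classical runtime for the RCS benchmark
age_of_universe_years = 1.38e10      # ~13.8 billion years

ratio = rcs_classical_years / age_of_universe_years
print(f"{ratio:.3g}")                # ~7.25e14: roughly 725 trillion times the universe's age
```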

While this comparison applies to a specific class of problems, it offers a glimpse into the raw potential of quantum computing. It’s also worth remembering that the field is still in its infancy. It took 80 years to go from ENIAC to El Capitan. The first quantum computer was built just 27 years ago, in 1998.

But just as classical computing required the invention of the microprocessor to scale, quantum computing will need its own foundational leap—an innovation that enables practical, on-chip integration. Today, companies like SEEQC are building that quantum equivalent: integrating quantum processors and classical control circuitry onto a single chip to enable the kind of system-wide scalability that could truly usher in the quantum age.

Speed is only the beginning. As we’ll see, quantum computers have the potential to revolutionize entire industries by solving problems that are currently intractable for classical machines. Thanks to the principles of quantum mechanics, these systems can process vast amounts of data in parallel and explore multiple possibilities at once. This allows quantum computers to tackle complex, multivariable challenges in ways that classical systems simply cannot—especially when dealing with uncertainty, optimization, and large-scale interactions.

As Seth Lloyd, a professor of mechanical engineering at MIT, put it:

“A classical computation is like a solo voice—one line of pure tones succeeding each other. A quantum computation is like a symphony—many lines of tones interfering with one another.”

In the years ahead, this symphony will harness uncertainty and complexity to unlock breakthroughs poised to transform the world.

The Stakes of the Quantum Race

Who comes out ahead in the quantum race could dramatically reshape the global balance of power—a reality well understood by both allies and adversaries. Like most technologies, quantum computing is morally neutral. Its impact on the world will be decided by how we use it.

In the right hands, quantum computers could accelerate progress on some of humanity’s greatest challenges. They could help cure diseases like cancer and Alzheimer’s, combat climate change by optimizing energy systems, and address food insecurity by simulating crop behavior, improving farming methods, and streamlining global food distribution. The potential for good is enormous.

In the wrong hands, quantum technology could entrench authoritarian control on an unprecedented scale. A regime such as China—or worse, Russia—leading in quantum computing could gain unchecked access to global information systems, break even the most advanced encryption, and manipulate critical infrastructure with impunity. Militarized quantum capabilities could enable next-generation weapons, bolster surveillance states, and erode any meaningful resistance to their influence. The consequences would go beyond geopolitical realignment—they could institutionalize fear, suppress freedom, and harden the foundations of coercive power worldwide.

The implications of the quantum race are far greater than those of the space race. For all its complexities, the U.S. and the democratic world represent the promise of a better future for humanity. The question is whether we are willing to take the bold leadership necessary to emerge victorious, as we have in the past. The stakes are high, and the path we choose will ultimately determine the kind of world we create.

In his 1962 address at Rice University on the nation’s space effort, John F. Kennedy remarked:

Those who came before us made certain that this country rode the first waves of the industrial revolutions, the first waves of modern invention, and the first wave of nuclear power, and this generation does not intend to founder in the backwash of the coming age of space. We mean to be a part of it—we mean to lead it. For the eyes of the world now look into space, to the moon and to the planets beyond, and we have vowed that we shall not see it governed by a hostile flag of conquest, but by a banner of freedom and peace. We have vowed that we shall not see space filled with weapons of mass destruction, but with instruments of knowledge and understanding.

Quantum Computing:
The Basics

As Richard Feynman once remarked, “If you think you understand quantum mechanics, then you don’t understand quantum mechanics.” The more one delves into the quantum world, the less sense it seems to make.

For example, simply observing something shouldn’t change what it is. But in a phenomenon known as wave-function collapse, a particle can exist in multiple states at once—only “choosing” a single state the moment we measure it. In quantum superposition, objects can appear to be in two places at the same time, challenging our intuitive sense of space and time. There’s quantum entanglement, where two particles become so deeply connected that the state of one instantly affects the state of the other, even if they are light years apart. And then there is interference, where the probabilities of different quantum states can combine in ways that either amplify or cancel each other out, creating outcomes that don’t follow the rules of classical physics. These strange behaviors mean quantum states can influence one another in unpredictable ways, making it impossible to know results with the certainty we’re accustomed to in the everyday world.

Figure 3: The fundamental quantum phenomena that enable quantum computing’s extraordinary capabilities.

Like the rest of the quantum world, quantum computing is built on principles that seem strange when compared to classical computing. Yet it’s precisely these principles that give quantum computers their extraordinary potential. Below are some of the core concepts that make this technology possible:

Superposition

In classical computing, information is stored as bits that are either 0 or 1. In quantum computing, the basic unit of information is a qubit, which can exist as both 0 and 1 at the same time—a phenomenon known as superposition. A helpful way to visualize this is as a spinning coin that hasn’t yet landed: while in motion, it represents both heads and tails. Similarly, a qubit in superposition holds multiple possibilities at once, allowing quantum computers to explore many computational paths simultaneously.
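This behavior can be sketched with a small state-vector simulation. The example below is a minimal NumPy illustration (not a real quantum device or any particular quantum SDK): it prepares a qubit in superposition with a Hadamard gate and shows the resulting 50/50 measurement probabilities.

```python
import numpy as np

# A qubit's state is a 2-component complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)

# The Hadamard gate takes a definite |0> into an equal superposition.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] — like the spinning coin, both outcomes coexist
```

Until the qubit is measured, both amplitudes persist; measurement is what forces the coin to land.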

Entanglement

One of the most powerful—and puzzling—features of quantum computing is entanglement. When qubits become entangled, they no longer act independently. Instead, they function as part of a single, unified system, with changes to one instantly influencing the other, even across vast distances. Entangled qubits can jointly represent a wide range of possibilities. For example, two entangled qubits can encode four states simultaneously; three can represent eight; and in general, n qubits can represent 2ⁿ states. It’s not just the number of qubits that matters—it’s their connectedness. Without entanglement, quantum systems lose their exponential advantage.
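The 2ⁿ growth is visible directly in simulation: an n-qubit register is described by 2ⁿ complex amplitudes. The hedged NumPy sketch below entangles two qubits into a Bell state, whose four amplitudes show perfectly correlated measurement outcomes.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I = np.eye(2, dtype=complex)
# CNOT: flips the second qubit when the first qubit is 1.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start both qubits in |00>; the joint state of n qubits has 2**n amplitudes.
psi = np.zeros(4, dtype=complex)
psi[0] = 1
print(len(psi))  # 4 = 2**2

# Hadamard on qubit 0, then CNOT, yields the Bell state (|00> + |11>)/sqrt(2).
psi = CNOT @ np.kron(H, I) @ psi

# Only |00> and |11> ever appear: measuring one qubit fixes the other.
print(np.round(np.abs(psi) ** 2, 3))
```

Note that the two qubits can no longer be described separately; only the joint 4-amplitude vector captures the correlation.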

Interference

Quantum computers also use interference to guide calculations toward the correct answer. In simple terms, interference allows algorithms to boost the probability of correct outcomes while suppressing the wrong ones—similar to how waves can either amplify or cancel each other out. This relies on the structure provided by superposition and entanglement. Consider a deck with three cards, one of which is a queen. A classical computer would check each card in sequence. A quantum computer, however, can “peek” at all three simultaneously because they exist in an entangled superposition. Through carefully designed interference patterns, the quantum system amplifies the probability of revealing the queen and reduces the chance of selecting the wrong card. Once the interference pattern is applied, the system collapses into a single result—ideally the correct one.
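The three-card search can be sketched numerically. The toy example below is a simplified, Grover-style amplitude amplification in NumPy (not a full quantum circuit): it flips the phase of the queen's amplitude, then "inverts about the mean," and interference lifts her probability from 1/3 to over 90%.

```python
import numpy as np

# Uniform superposition over 3 "cards"; card 2 is the queen.
amps = np.ones(3) / np.sqrt(3)
queen = 2

# Oracle step: flip the sign (phase) of the queen's amplitude.
amps[queen] *= -1

# Diffusion step ("inversion about the mean"): constructive interference
# boosts the flipped amplitude, destructive interference suppresses the rest.
mean = amps.mean()
amps = 2 * mean - amps

probs = amps ** 2
print(np.round(probs, 3))  # queen's probability jumps from 1/3 to 25/27 ≈ 0.926
```

A measurement after this single iteration would reveal the queen roughly 93% of the time, versus the 33% of a blind guess.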

Decoherence

Quantum systems are extremely sensitive. Decoherence occurs when a qubit interacts with its environment, causing it to lose its quantum state. This collapse is similar to a spinning coin being knocked over—it settles into a single outcome. Once decoherence occurs, the qubit behaves classically, and the advantages of superposition and entanglement are lost. To prevent this, quantum computers must operate in highly controlled environments—often at temperatures near absolute zero—to isolate qubits from external noise and preserve their fragile quantum states.
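A rough classical analogy shows why this matters: interference only works while phase relationships survive. In the sketch below (illustrative only; Gaussian phase noise stands in for coupling to the environment), a noiseless Hadamard–Hadamard sequence returns the qubit to |0⟩ every time, while strong phase noise washes the interference out to a 50/50 coin flip.

```python
import numpy as np

rng = np.random.default_rng(0)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
ket0 = np.array([1, 0], dtype=complex)

def prob_return_to_0(phase_noise_std):
    # H . (phase error) . H applied to |0>: with no noise, the two paths
    # interfere perfectly and the qubit returns to |0> every time.
    trials = []
    for _ in range(2000):
        theta = rng.normal(0, phase_noise_std)
        noise = np.diag([1, np.exp(1j * theta)])  # environment kicks the phase
        psi = H @ noise @ H @ ket0
        trials.append(abs(psi[0]) ** 2)
    return np.mean(trials)

print(prob_return_to_0(0.0))  # 1.0 — coherent: perfect interference
print(prob_return_to_0(3.0))  # near 0.5 — decohered: the coin has "landed"
```

Once the phase information is scrambled, the qubit's statistics are indistinguishable from a classical random bit, which is exactly the advantage-destroying collapse the text describes.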

Probabilistic vs. Deterministic Computation

Classical computers are deterministic: given the same input, they will always produce the same output. Quantum computers, in contrast, are probabilistic. They explore a range of possible outcomes and return the most likely result based on the structure of the problem and the interference applied. While this may sound uncertain, well-designed quantum algorithms can drive these probabilities toward extremely high confidence, especially for problems that are computationally intractable for classical machines.
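In practice this means a quantum program is run many times and its outputs are treated as samples. The sketch below uses a hypothetical final-state distribution to show how repeating the computation and taking a majority vote recovers the intended answer with high confidence.

```python
import numpy as np

rng = np.random.default_rng(42)

# A well-designed quantum algorithm concentrates probability on the right
# answer; each measurement then samples from that distribution.
probs = np.array([0.02, 0.03, 0.90, 0.05])  # hypothetical outcome probabilities

# Each "run" is one measurement: same input, possibly different output.
runs = rng.choice(4, size=1000, p=probs)
counts = np.bincount(runs, minlength=4)
print(counts)  # outcome 2 dominates; a few runs return wrong answers

# Majority voting over repeated runs drives confidence toward certainty.
print(np.argmax(counts))  # 2
```

Individual runs can be wrong, but the aggregate is not: this repeat-and-vote pattern is how probabilistic quantum output is turned into a reliable result.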

Exponential vs. Linear Scaling

Classical computing improves linearly—more processors yield steady gains. Quantum computing, by contrast, offers exponential potential: each additional qubit doubles the size of the computational space, unlocking the possibility of solving problems far beyond the reach of classical machines. But scale alone isn’t enough. Real-world speedups depend on more than qubit count—they require high-fidelity operations, sufficient circuit depth, and algorithms tailored to harness quantum effects. A 1,000-qubit device that decoheres in microseconds solves nothing. Still, as hardware and error correction improve, we are approaching the threshold where quantum systems can achieve what no classical computer ever could. Microsoft CEO Satya Nadella captured this shift when he remarked, “Quantum computing is not only faster than conventional computing, but its workload obeys a different scaling law—rendering Moore’s Law little more than a quantum memory.” His point reflects a fundamental truth of quantum systems: rather than cramming more transistors onto a chip, as Moore’s Law suggests, quantum computers increase their power by leveraging superposition and entanglement to expand their computational capacity exponentially. Just one more qubit means twice the computational reach—a leap that redefines the boundaries of what’s possible.
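The scaling argument can be made concrete by counting the memory a classical machine would need just to store an n-qubit state: each added qubit doubles the amplitude count, and by 50 qubits the state vector already runs to petabytes.

```python
# An n-qubit state is 2**n complex amplitudes; at 16 bytes per complex128
# amplitude, merely *storing* it classically grows exponentially with n.
for n in (10, 20, 30, 40, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2**30
    print(f"{n} qubits -> {amplitudes:,} amplitudes (~{gib:,.3f} GiB)")
```

At 30 qubits the state fits in a workstation's RAM (16 GiB); at 50 qubits it needs roughly 16 million GiB, which is why classical simulation of quantum systems hits a wall that quantum hardware does not.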

Figure 2: While classical computers scale linearly with additional processing units, quantum computers scale exponentially with each additional qubit.

These principles form the backbone of quantum computing. By harnessing phenomena like superposition, entanglement, interference, and exponential scaling, quantum computers approach problems in ways that are fundamentally unlike anything classical machines can achieve. The result isn’t just more speed—it’s a new computational paradigm. In the next section, we’ll explore what this shift means for the world—and how it could transform industries, economies, and the future of global leadership.

What Quantum Computers Are Capable Of

Quantum computers will have a profound impact across a range of fields, offering new possibilities for solving problems that are currently beyond the reach of classical computing. This section explores some of the most exciting and transformative applications of the technology.

Figure 4: Near-term quantum applications promise high transformative potential across multiple industries.

Molecular and Chemical Simulation

Classical computing has driven remarkable progress across nearly every scientific discipline, and chemistry is no exception. But even our most powerful supercomputers fall short when it comes to simulating the quantum behavior of electrons—something essential for breakthroughs in drug discovery, materials science, and clean energy.

Most chemical R&D today relies on Computer-Aided Drug Design (CADD) and high-performance computing (HPC)—both of which are powerful but fundamentally limited. These approaches are iterative, time-consuming, and heavily dependent on trial and error. Researchers must often rely on crude approximations to model electron interactions, and even modest improvements in accuracy can require immense computational resources. Despite advances in AI-assisted methods, key quantum behaviors often remain beyond the reach of classical models.

These limitations have created a significant innovation bottleneck, particularly in the pharmaceutical industry, where the return on investment for developing new drugs often no longer justifies the immense R&D costs. As a result, many companies have shifted focus to repurposing existing compounds for new applications rather than pursuing fundamentally new treatments. This bottleneck is more than an economic issue. It is also preventing much-needed progress on treatments for diseases like cancer, Alzheimer’s, and antibiotic-resistant infections. New breakthroughs in these areas have the potential to save millions of lives each year.

Quantum computing offers a fundamentally different path than CADD and HPC. Instead of approximating electron behavior through classical shortcuts, quantum computers use qubits to model key quantum properties—such as spin, energy levels, and orbital interactions—directly. While quantum systems still simulate electrons rather than replicate them, they do so in a way that captures the entanglements and correlations that define quantum chemistry. In short, quantum computers “speak” the same mathematical language as the systems they model.

This unlocks the possibility of simulating entire molecules and chemical reactions with a level of precision classical machines simply can’t achieve. In drug discovery, that could mean testing thousands of drug compounds virtually before entering a lab, dramatically reducing the time, cost, and uncertainty associated with pharmaceutical R&D.

But the implications extend far beyond medicine. Quantum simulation could lead to the creation of ultra-efficient batteries, cleaner industrial catalysts, novel materials for energy storage, and scalable solutions for carbon capture. In this sense, quantum computing isn’t just a new tool—it’s a strategic technology with the potential to reshape entire industries.

Companies like Merck KGaA have recognized this. Faced with growing global competition, especially from countries with greater wet-lab capacity, Merck has emphasized the importance of digitization—and quantum computing in particular—to maintain a competitive edge. Through projects like BAIQO and partnerships focused on quantum chemistry, the company has positioned quantum technologies as essential to sustaining innovation and strategic leadership in the life sciences.

Securing Communications
& Post-Quantum Cryptography

In March 2025, [WIRED](https://www.wired.com/story/q-day-apocalypse-quantum-computers-encryption/) featured a story, “The Quantum Apocalypse Is Coming. Be Very Afraid.” Though a bit melodramatic, the title refers to a very real concern over what cybersecurity analysts call “Q-Day”—the day someone builds a quantum computer that can crack the most widely used forms of encryption. According to the article, all of humanity’s most sensitive data—from personal emails and text messages to crucial national security documents—would suddenly become vulnerable.

In 1994, Massachusetts Institute of Technology (MIT) scientist [Peter Shor](https://www.cfr.org/backgrounder/what-quantum-computing) designed an algorithm that demonstrated that quantum techniques could crack classical encryption methods used to secure state secrets far faster than a conventional computer. To understand why this is such a concern, it helps to first go over the basics of encryption.
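The core of Shor’s insight is that factoring reduces to finding the period of modular exponentiation, a step a quantum computer performs exponentially faster than any known classical method. A toy walk-through on the smallest interesting case, with the period found classically by brute force:

```python
# Classical illustration of the period-finding step behind Shor's algorithm.
# A quantum computer finds the period r exponentially faster; everything
# after that point is cheap classical arithmetic.
from math import gcd

n, a = 15, 7   # factor n = 15 using the base a = 7
# Brute-force the period r of f(x) = a^x mod n (the quantum speedup lives here).
r = next(x for x in range(1, n) if pow(a, x, n) == 1)   # r = 4
assert r % 2 == 0                  # for an odd r, retry with a different base a
f1 = gcd(pow(a, r // 2) - 1, n)    # gcd(7^2 - 1, 15) = gcd(48, 15) = 3
f2 = gcd(pow(a, r // 2) + 1, n)    # gcd(7^2 + 1, 15) = gcd(50, 15) = 5
assert f1 * f2 == n                # 3 * 5 = 15: the factors of n recovered
```

For cryptographically sized moduli the brute-force loop above is hopeless, which is exactly why the quantum version of period-finding matters.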

The most commonly used encryption method is RSA, which secures data by making it computationally difficult to decode without the proper key. The process begins by selecting two large prime numbers, each typically hundreds of digits long. These primes are multiplied together to form a number, *n*, which becomes part of the public key. The public key is shared openly, allowing others to use it to encrypt messages.

The two original prime numbers, *p* and *q*, remain secret. From these primes, the private key is derived, and it is the only key capable of decrypting messages that were encrypted with the public key. The security of this system relies on the fact that, while the public key *n* is available to everyone, [factoring](https://quantum-computing.ibm.com/docs/learn/gate-model/shors-algorithm) *n* back into *p* and *q* to recover the private key is extremely difficult for classical computers; even the most powerful supercomputers would need [thousands of years](https://www.technologyreview.com/2022/11/07/1062469/quantum-computers-crack-encryption/). As discussed in the introduction, a sufficiently powerful quantum computer could do it in a matter of days, potentially even hours.
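A toy version of this key cycle, using primes small enough to factor by hand (real RSA primes are hundreds of digits long), makes the asymmetry concrete:

```python
# Toy RSA: tiny primes for illustration only; these can be factored instantly.
from math import gcd

p, q = 61, 53               # the secret primes
n = p * q                   # 3233: the public modulus
phi = (p - 1) * (q - 1)     # 3120: used to derive the private key
e = 17                      # public exponent, chosen coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)         # 2753: private exponent (modular inverse of e)

msg = 65
cipher = pow(msg, e, n)     # anyone can encrypt with the public key (e, n)
plain = pow(cipher, d, n)   # only the private key (d, n) decrypts
assert plain == msg

# An attacker who factors n recovers p and q, and with them the private key:
p_found = next(k for k in range(2, n) if n % k == 0)
assert {p_found, n // p_found} == {p, q}
```

With 61 and 53, trial division in the last step succeeds immediately; at real key sizes that same step is the thousands-of-years problem the paragraph above describes.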

Though quantum computing poses a problem for cryptography, it also offers a solution, as quantum technologies could both supercharge decryption and facilitate new forms of cryptography. Instead of mathematical principles, quantum cryptography relies on the unique principles of quantum mechanics to ensure that any attempt to observe or tamper with a communication irreversibly alters the data, making undetected eavesdropping virtually impossible.

One method is [quantum key distribution](https://www.fortinet.com/resources/cyberglossary/quantum-key-distribution) (QKD), which transmits [single photons](https://quantumxc.com/blog/how-does-quantum-key-distribution-work/)—each carrying a qubit of information—along a fiber optic cable. The sender uses [polarized filters](https://www.rp-photonics.com/quantum_key_distribution.html) to set the orientation of each photon, while the receiver uses beam splitters to measure it. After comparing the results, both parties extract a shared encryption key. Unlike classical encryption, any attempt to intercept the photons alters their state, alerting both sides.
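The logic of QKD can be sketched in a few lines. This toy simulation (no eavesdropper, no channel noise) follows the BB84 scheme described above: random bases on both ends, then a public comparison of bases, never bits, to distill a shared key.

```python
# Minimal BB84 sketch: sender and receiver pick random bases and keep only
# the bits where the bases matched. Idealized: no eavesdropper, no noise.
import random

random.seed(7)
n_photons = 32

alice_bits  = [random.randint(0, 1) for _ in range(n_photons)]
alice_bases = [random.choice("+x") for _ in range(n_photons)]  # + rectilinear, x diagonal

bob_bases = [random.choice("+x") for _ in range(n_photons)]
# If Bob measures in Alice's basis he reads her bit; if the bases differ,
# quantum mechanics makes his outcome a 50/50 coin flip.
bob_bits = [b if ab == bb else random.randint(0, 1)
            for b, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Over a public channel they compare bases (not bits) and keep the matches.
key_alice = [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
assert key_alice == key_bob   # shared key; roughly half the photons survive
```

An interceptor would have to measure each photon in a guessed basis, disturbing about half of the mismatched ones, which the two parties detect by sacrificing and comparing a sample of key bits.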

Given the rapid pace of AI and other emerging technologies, Q-Day may arrive sooner than expected. While most experts estimate it will occur in the early 2030s, some believe it could happen much earlier. It is even possible that Q-Day has already arrived but, given the immense strategic advantage of keeping such a breakthrough secret, has not been made public. Regardless, preparing for a post-classical encryption world must be a top priority for governments worldwide.

Advancing Artificial Intelligence

As computer scientist and entrepreneur Andrew Ng argued, AI is poised to become “the new electricity.” Just as electricity transformed nearly every industry in the 20th century—driving breakthroughs in manufacturing, communication, healthcare, and transportation—AI promises to reshape nearly every aspect of modern life, perhaps even more profoundly. While much of the spotlight has been on AI, it is deeply intertwined with quantum computing: the two technologies act as force multipliers, each amplifying the capabilities of the other and unlocking possibilities that neither could achieve alone.

Quantum computing has the potential to dramatically accelerate machine learning by processing high-dimensional datasets more efficiently than classical systems. Current large language models (LLMs), for example, require over a million GPU hours to train. Quantum neural networks (QNNs) offer a path to far more efficient training by leveraging principles like superposition and entanglement, enabling numerous calculations to be performed simultaneously.

But realizing this potential doesn’t mean replacing classical systems—it means integrating them. Most real-world AI workflows will involve a hybrid model, where quantum and classical systems work in tandem. Classical computers are better suited to certain tasks, such as data preprocessing and decision logic, while quantum systems excel at complex pattern recognition, optimization, and probabilistic inference. This back-and-forth interaction, where data and workloads move between classical and quantum systems, makes the development of seamless interface technologies critical.
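A minimal sketch of such a hybrid loop, in the style of a variational quantum algorithm. The QPU call here is stubbed out by a one-qubit classical simulation (an assumption for illustration): a classical optimizer proposes a rotation angle, the “quantum” side reports a measurement statistic, and the classical side updates.

```python
# Hybrid quantum-classical loop sketch. The QPU is replaced by a classical
# one-qubit simulation; on real hardware, quantum_expectation would dispatch
# a circuit to the quantum processor and return measurement statistics.
import math

def quantum_expectation(theta):
    # Stand-in for a QPU call: after RY(theta)|0>, the amplitude on |1>
    # is sin(theta/2); return the probability of measuring |1>.
    return math.sin(theta / 2) ** 2

def classical_update(theta, lr=0.4):
    # Finite-difference gradient ascent, done entirely on the CPU.
    eps = 1e-4
    grad = (quantum_expectation(theta + eps)
            - quantum_expectation(theta - eps)) / (2 * eps)
    return theta + lr * grad

theta = 0.3
for _ in range(100):
    theta = classical_update(theta)   # classical optimizer drives the loop

# The loop converges toward theta = pi, where P(|1>) = 1.
assert quantum_expectation(theta) > 0.99
```

The division of labor mirrors the paragraph above: the classical side handles bookkeeping and optimization logic, while the (here simulated) quantum side evaluates the quantity that is hard to compute classically at scale.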

Leading technology companies like NVIDIA, alongside quantum hardware firms, are already advancing this frontier, developing the tools and infrastructure to bridge quantum and classical computing. Our company, SEEQC, recently demonstrated the first digital chip-to-chip connectivity between a GPU and an integrated Quantum Processing Unit (QPU) for quantum AI. These hybrid architectures will likely be essential to scaling AI safely and efficiently in the years ahead.

Just as quantum technologies can enhance the capabilities of AI, AI can, in turn, help accelerate progress in quantum computing. A recent study on quantum AI published by the Quantum Economic Development Consortium (QED-C) highlights several promising applications: AI can optimize the use of computational resources for real-time error correction—one of the most critical challenges in scaling quantum systems. It can also support more efficient qubit calibration and contribute to the design of next-generation quantum chips, ultimately improving system performance and reliability.

Optimizing the Energy Ecosystem

Finally, in what could be its most important near-term application, quantum computing has the potential to significantly impact energy consumption through quantum-enabled applications that optimize energy systems and reduce the enormous power requirements of the data centers themselves. According to a Goldman Sachs Research forecast, global power demand from data centers is expected to increase by 50% by 2027, with a projected 165% increase by the end of the decade compared to 2023 levels. Some large data centers already draw 100 MW of power, enough to supply roughly 80,000 homes.

Figure 5: Optimized quantum systems could dramatically reduce data center energy requirements compared to classical and AI infrastructure.

The rising energy demands for AI account for a notable portion of this increase. The IT infrastructure required to support AI workloads primarily involves user terminals (such as computers and smartphones) and, most importantly, data centers—of which there are over 8,000 worldwide. The U.S. is home to roughly 33%, Europe to 16%, and China to 10%, according to the International Energy Agency (IEA). If Google were to transform its search engine into something akin to ChatGPT, with nine billion chatbot interactions per day instead of regular search queries, the energy demand would surge to levels comparable to the entire nation of Ireland.

Such growth in AI-driven processes comes with considerable financial costs. If Google were to make the switch noted above, it would require a $100 billion investment to meet the power needs. [Four tech](https://www.techtarget.com/whatis/feature/Three-tech-companies-eyeing-nuclear-power-for-AI-energy) companies (Microsoft, Oracle, Amazon, and Google) currently have plans to build their own nuclear power plants. There are also considerable environmental costs: each month, ChatGPT generates around 260,930 kilograms of carbon dioxide, the equivalent of 260 flights between New York and London.

Scaled quantum data centers, as presently proposed, have an energy-intensive profile similar to that of AI-oriented data centers. Yet in contrast to the brute-force solution of adding massive energy infrastructure to power future quantum data centers, there are technical approaches in the hybrid quantum-classical domain that can reduce the electricity needed from tens of megawatts per system down to tens of kilowatts for scaled quantum computers. For instance, a typical superconducting quantum computer uses approximately 2-5 watts of room-temperature control electronics per qubit. A quantum system engineered for energy efficiency—using classical superconducting technologies like Single Flux Quantum (SFQ) circuits—can reduce power consumption to just 3 nanowatts per qubit, making it roughly a billion times more efficient than conventional approaches. This efficiency scales throughout the quantum system up to the data center level and can significantly reduce the energy footprint of quantum data centers.
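The “billion times” figure follows directly from the per-qubit numbers quoted above, treating them as rough order-of-magnitude estimates:

```python
# Back-of-the-envelope ratio behind the "roughly a billion times" claim,
# using the per-qubit figures quoted in the text as rough estimates.
watts_per_qubit_conventional = 2.0   # low end of the 2-5 W range
watts_per_qubit_sfq = 3e-9           # 3 nanowatts with SFQ control

ratio = watts_per_qubit_conventional / watts_per_qubit_sfq
print(f"{ratio:.1e}")                # on the order of 10^8 to 10^9
```

Taking the high end of the conventional range (5 W) pushes the ratio above 1.6 billion, so “roughly a billion” is a fair summary of the span.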

Data Center Energy Consumption

Source: IEA

As AI continues to scale, quantum computing could provide a much-needed solution to reduce the world’s energy footprint, offering transformative solutions across multiple areas—from reducing energy requirements in data centers to improving renewable energy forecasting to optimizing grid management. The following are some of the ways quantum computing is poised to revolutionize the energy industry:

Renewable Energy Forecasting

Quantum computing's ability to handle massive datasets and perform parallel computations could dramatically improve the accuracy of energy forecasting. By processing data from weather models, environmental sensors, and historical energy trends, quantum algorithms can predict fluctuations in renewable energy production with greater precision. This will enable grid operators to better anticipate shifts in supply and demand, ensuring more reliable integration of renewable sources like wind and solar into the power grid.

Optimizing Grid Management

To balance energy supply and demand while minimizing energy losses, efficient grid management is essential. Quantum computing can help by rapidly analyzing real-time grid conditions and identifying potential bottlenecks or inefficiencies. These insights allow for quick adjustments in energy distribution, improving overall grid performance and ensuring energy is delivered where it’s needed most without the risk of overloads or downtime.

Enhancing Energy Storage Systems

Energy storage is key to balancing the intermittent nature of renewable energy sources. Quantum computing can optimize the design and deployment of energy storage systems, like advanced batteries, by simulating material properties at a quantum level. This allows for the development of more efficient storage solutions, which are critical for stabilizing the grid and ensuring energy is available when demand peaks.
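"Simulating material properties at a quantum level" ultimately means computing quantities like the ground-state energy of a Hamiltonian. The snippet below solves a deliberately tiny two-site hopping model exactly with NumPy; the hopping amplitude and on-site energy are illustrative placeholders. Real battery chemistry involves Hamiltonians exponentially larger than classical machines can diagonalize, which is where quantum algorithms such as VQE and quantum phase estimation come in.

```python
# Minimal sketch: exact diagonalization of a toy 2-site model Hamiltonian.
# Parameters are illustrative; real materials problems are vastly larger.
import numpy as np

t = 1.0    # hopping amplitude between the two sites
eps = 0.5  # on-site energy offset of the second site

# Single-particle Hamiltonian for two coupled sites.
H = np.array([[0.0, -t],
              [-t,  eps]])

# The lowest eigenvalue is the ground-state energy -- the quantity a
# quantum computer would estimate for an intractably large material.
ground_energy = np.linalg.eigvalsh(H).min()
print(round(ground_energy, 4))
```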

Improving Demand Response Strategies

As demand-response systems evolve, quantum computing can enhance the alignment between energy supply and consumption. By analyzing real-time data, quantum algorithms can help optimize when energy-intensive activities should occur, ensuring that they take place during times of high renewable energy availability. This not only reduces strain on the grid but also helps integrate more renewable energy, making the system more sustainable and cost-effective.
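The demand-response idea reduces to a scheduling problem: place flexible, energy-intensive jobs into the hours with the most forecast renewable generation. The hourly forecast values and the greedy classical picker below are placeholders for illustration; quantum optimizers target the much larger joint problem of scheduling many interacting loads under grid constraints.

```python
# Toy demand-response scheduler: pick the best hours for a flexible load.
# Forecast numbers are invented for demonstration.
renewable_forecast_mw = {9: 22, 12: 55, 15: 48, 18: 17, 21: 8}  # hour -> forecast MW

def best_hours(forecast, n_slots):
    """Return the n_slots hours with the highest forecast renewable supply."""
    return sorted(sorted(forecast, key=forecast.get, reverse=True)[:n_slots])

print(best_hours(renewable_forecast_mw, 2))  # hours to run the flexible load
```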

Quantum-Driven Innovations in Energy Materials

The Past Offers No Guarantees for Future Success in the Quantum Age

Since the end of the Second World War, the U.S. has led every stage of the classical computing revolution. In the 1950s, IBM launched its 700-series mainframe computers for use in government and business. A decade later, the company announced the System/360 mainframe family – a “bet-the-company” $5 billion project that “ushered in a new era of compatibility” in computing. In the 1980s and 1990s, the U.S. held more than 90% of the global market for personal computers through companies like IBM, Microsoft, and Apple. This dominance continued well into the 2000s, with Microsoft Windows becoming the standard operating system for PCs worldwide. Similarly, U.S. companies have maintained global leadership across nearly every layer of the classical computing stack—from chip design and application software to networking infrastructure and cloud services. Aside from chip fabrication, the entire classical computing ecosystem remains firmly anchored in American control.

None of this was an accident–it was the result of strategic government policies, funding, and a conducive innovation ecosystem. It has also translated into significant geopolitical advantages, including military superiority, economic power, and an overall higher standard of living.

This could soon become irrelevant. Countries that lag in traditional semiconductor and computing industries are pouring resources into quantum computing as a way to leapfrog the current tech hierarchy. This, of course, includes China, which has invested $15.3 billion into the technology, around four times more than the U.S.

Figure 6: China’s quantum investment significantly outpaces other nations, highlighting the strategic importance placed on this technology.

In 2023, results from experiments suggested that Chinese physicists were notching impressive [achievements](https://time.com/6963281/quantum-computing-history/) that may enable them to construct a quantum computer that could outpace those developed in the U.S. Russia has also placed its bets on quantum. In the last few years, thanks to state investments designed to accelerate quantum capabilities, the country has gone from essentially zero to 50-qubit devices.

U.S. allies are also investing heavily in quantum. With its nearly €7 billion public investment in quantum, the EU ranks second only to China. According to Vishal Chatrath, CEO and co-founder of QuantrolOx, “In Europe, because we lost out in classical computing, we have an opportunity to leapfrog in quantum computing.” India, another nation without a large footprint in classical computing hardware, has launched an ambitious program to become a leader in quantum technology. In April 2023, the Indian government approved a National Quantum Mission (NQM) with an outlay of ₹6,003.65 crore (approximately $730 million) over eight years. This mission explicitly frames quantum tech as an “unprecedented opportunity for India to leapfrog” in computing capabilities.

Global Quantum Computing Investment by Country

Figure 9: National quantum computing investments reveal the competitive landscape, with multiple countries vying for quantum leadership.

The history of 5G, an incremental technology advance in an evolutionary wireless roadmap, offers a clear warning. The rollout of 5G in the U.S. and much of the West was characterized by a [slow mobilization](https://www.foreignaffairs.com/united-states/will-china-escalate) and a [fragmented](https://www.foreignaffairs.com/united-states/china-still-winning-battle-5g-and-6g), market-driven approach. Unlike previous generations (where Western firms led), the U.S. and its allies did not treat 5G as a unified strategic priority early on. Instead, deployment was left largely to private telecom companies, with minimal centralized industrial policy or government coordination. This led to delays in critical areas – for example, the U.S. was slow to allocate mid-band spectrum (the optimal frequencies for 5G) due to piecemeal policymaking and competing uses of spectrum.

As a result, by the end of 2022, China had around [355 million](https://www.lightreading.com/5g/how-real-is-china-s-5g-gap-) 5G users, compared with around 50 million in the U.S. Moreover, China has deployed 3.5 million 5G base stations, which it says account for more than 60% of the world's total; the U.S. has around 175,000. Geopolitically, China’s 5G surge enhanced its strategic influence. By supplying affordable 5G gear to telecom networks across Asia, Africa, and Latin America, Chinese firms became integral to many nations’ digital ecosystems, and through those networks they were able to install proprietary technologies that reduced the overall security and integrity of the data being transmitted and received.

In the case of 5G, Western nations relied too heavily on market forces and fragmented policies, while China executed a top-down, state-driven approach that rapidly scaled infrastructure, secured global market share, and influenced international standards. This highlights the need for coordinated early action, robust public investment, and a clear strategic vision. Quantum computing presents an even greater discontinuity—one that could redefine military power, economic competitiveness, and digital sovereignty. To avoid another cycle of complacency and catch-up, the West must treat quantum as a national and allied priority, investing not only in research but also in commercialization, standards-setting, and supply chain resilience.

The stakes are clear: the leader in quantum and AI will lead the world throughout the remainder of the century. As [Mike Pezzullo](https://thequantuminsider.com/2025/03/29/guest-post-the-quantum-cold-war-is-here/), the former Secretary of Australia's Department of Home Affairs, put it: “Quantum computing combined with advanced AI will rule the world.” This echoes [Vladimir Putin's](https://www.theverge.com/2017/9/4/16251226/russia-ai-putin-rule-the-world) famous statement that “The nation that dominates the information processing field will possess the keys to world leadership.” Paradoxically, the U.S.'s historical success in classical computing may prove a liability down the road, fostering complacency and institutional inertia in the face of a radically different paradigm. In the quantum age, legacy systems offer no guarantees of continued success. The remainder of the paper will examine where things currently stand and the ways in which the U.S. can avoid falling behind in the quantum race.

The Current Landscape: Opportunities & Gaps

Though much remains to be done, the U.S. has taken strides over the last several years to make quantum technology a national priority. The 2018 National Quantum Initiative (NQI) created a coordinated federal program to support research and accelerate development. Following its passage, multiple agencies—including the White House, the National Science Foundation (NSF), the Department of Energy (DOE), the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), the Department of Homeland Security (DHS), the U.S. Department of Agriculture (USDA), and the National Institute of Standards and Technology (NIST)—launched new quantum research centers, established public-private partnerships, and built early-stage infrastructure. Overall, the initial NQI authorization supported about $2.6 billion in quantum activities through 2023. A reauthorization bill, introduced in 2024, sought to expand the initiative through 2034 and shift focus from basic research to maturity and deployment.

This reorientation is sorely needed. Most quantum companies in the U.S. trace their roots to academia, and while their scientific accomplishments are impressive, commercial viability remains difficult to reach. Hardware companies in particular face daunting capital requirements and long lead times, especially those that manufacture their own chips. Many are also locked into specific modalities—trapped ions, superconducting circuits, photonics, or neutral atoms—each with distinct advantages but no clear frontrunner. Switching tracks is rarely feasible. As a result, much of the ecosystem remains siloed, speculative, and fragile.

Recognizing these limitations, several new legislative and agency efforts have emerged to close the gap between research and real-world capability. The Quantum Energy Bill, sponsored by Senator Dick Durbin along with Senator Steve Daines, proposes $2.5 billion over five years to fund DOE-led development and demonstration of practical quantum systems. The bill is notable for its focus on national energy infrastructure and clean technology applications, both critical to long-term competitiveness and security. It could also include energy benchmarks for scaling quantum data centers to help the industry avoid near-term forecasted energy shortages projected for AI-focused data centers.

At the same time, DARPA has launched the Quantum Benchmarking Initiative, a structured, stage-gated program designed to answer a single question: Can a utility-scale quantum computer be built within the next decade? The program funds multiple hardware paths and includes independent validation teams to assess performance. It represents a welcome shift toward realism and technical accountability—a rare corrective in a field often inflated by hype. Companies such as IBM (partnered with SEEQC), Rigetti, Microsoft, HP Enterprises, PsiQuantum and 15 other companies have been selected for the initial round.

Though much remains to be done, the U.S. has taken strides over the last several years to make quantum technology a national priority. The 2018 National Quantum Initiative (NQI) created a coordinated federal program to support research and accelerate development. Following its passage, multiple agencies—including the White House, the National Science Foundation (NSF), the Department of Energy (DOE), the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), Department of Homeland Security (DHS), US Department of Agriculture (USDA) and the National Institute of Standards and Technology (NIST)—launched new quantum research centers, established public-private partnerships, and built early-stage infrastructure. Overall, the initial NQI authorization supported about $2.6 billion in quantum activities through 2023. A reauthorization bill, introduced in 2024, sought to expand the initiative through 2034 and shift focus from basic research to maturity and deployment.

This reorientation is sorely needed. Most quantum companies in the U.S. trace their roots to academia, and while their scientific accomplishments are impressive, commercial viability remains difficult to reach. Hardware companies in particular face daunting capital requirements and long lead times, especially those that manufacture their own chips. Many are also locked into specific modalities—trapped ions, superconducting circuits, photonics, or neutral atoms—each with distinct advantages but no clear frontrunner. Switching tracks is rarely feasible. As a result, much of the ecosystem remains siloed, speculative, and fragile.

Recognizing these limitations, several new legislative and agency efforts have emerged to close the gap between research and real-world capability. The Quantum Energy Bill, sponsored by Senator Dick Durbin along with Senator Steve Daines, proposes $2.5 billion over five years to fund DOE-led development and demonstration of practical quantum systems. The bill is notable for its focus on national energy infrastructure and clean technology applications, both critical to long-term competitiveness and security. It could also include energy benchmarks for scaling quantum data centers to help the industry avoid near-term forecasted energy shortages projected for AI-focused data centers.

At the same time, DARPA has launched the Quantum Benchmarking Initiative, a structured, stage-gated program designed to answer a single question: Can a utility-scale quantum computer be built within the next decade? The program funds multiple hardware paths and includes independent validation teams to assess performance. It represents a welcome shift toward realism and technical accountability—a rare corrective in a field often inflated by hype. Companies such as IBM (partnered with SEEQC), Rigetti, Microsoft, HP Enterprises, PsiQuantum and 15 other companies have been selected for the initial round.

Though much remains to be done, the U.S. has taken strides over the last several years to make quantum technology a national priority. The 2018 National Quantum Initiative (NQI) created a coordinated federal program to support research and accelerate development. Following its passage, multiple agencies—including the White House, the National Science Foundation (NSF), the Department of Energy (DOE), the Department of Defense (DoD), the National Aeronautics and Space Administration (NASA), Department of Homeland Security (DHS), US Department of Agriculture (USDA) and the National Institute of Standards and Technology (NIST)—launched new quantum research centers, established public-private partnerships, and built early-stage infrastructure. Overall, the initial NQI authorization supported about $2.6 billion in quantum activities through 2023. A reauthorization bill, introduced in 2024, sought to expand the initiative through 2034 and shift focus from basic research to maturity and deployment.

This reorientation is sorely needed. Most quantum companies in the U.S. trace their roots to academia, and while their scientific accomplishments are impressive, commercial viability remains difficult to reach. Hardware companies in particular face daunting capital requirements and long lead times, especially those that manufacture their own chips. Many are also locked into specific modalities—trapped ions, superconducting circuits, photonics, or neutral atoms—each with distinct advantages but no clear frontrunner. Switching tracks is rarely feasible. As a result, much of the ecosystem remains siloed, speculative, and fragile.

Recognizing these limitations, several new legislative and agency efforts have emerged to close the gap between research and real-world capability. The Quantum Energy Bill, sponsored by Senator Dick Durbin along with Senator Steve Daines, proposes $2.5 billion over five years to fund DOE-led development and demonstration of practical quantum systems. The bill is notable for its focus on national energy infrastructure and clean technology applications, both critical to long-term competitiveness and security. It could also include energy benchmarks for scaling quantum data centers, helping the industry avoid the near-term energy shortages projected for AI-focused data centers.

Meanwhile, the DoD has begun expanding its Microelectronics (ME) Commons to include quantum-focused projects. These regional innovation hubs, funded through the CHIPS and Science Act, aim to close the prototyping gap and build out domestic capacity for critical quantum components—from cryogenic control systems to precision optics. While the CHIPS Act is often associated with semiconductors, many quantum technologies don’t rely on conventional semiconductor designs. Instead, they require unique fabrication methods, materials, and packaging—making targeted investment in quantum-specific manufacturing not just important, but essential.

A secure domestic supply chain is especially urgent. Quantum systems depend on an array of specialized components: qubits, superconducting Single Flux Quantum chips for control and readout, vacuum chambers, cryogenic refrigeration, laser systems, and superconducting cabling—many of which are currently sourced primarily from Europe and East Asia. These dependencies expose the U.S. to strategic risk, particularly with global tensions and techno-nationalism on the rise. Furthermore, key enabling technologies like modular chip-to-chip interconnects—critical for building scalable quantum architectures—remain underdeveloped and underprotected in the U.S. While companies such as Rigetti and IBM have made important advances in this area, these interconnects require highly specialized techniques in advanced packaging, an area where the U.S. lags behind manufacturing giants like Taiwan’s TSMC. As we’ve seen in classical computing—particularly in NVIDIA’s dependence on TSMC for cutting-edge GPUs—advanced packaging is not a luxury; it’s a strategic necessity. Yet venture capital alone is unlikely to fund the capital-intensive infrastructure required to onshore these capabilities.

Without deliberate public investment, this manufacturing gap could stall the broader U.S. quantum effort. Although the National Science and Technology Council (NSTC) currently oversees interagency coordination, it lacks the budgetary authority to drive national deployment strategies. The recently proposed NATCAST framework—a placeholder for a more comprehensive, cross-sector approach—acknowledges how fragmented the landscape remains. Without clear national benchmarks, secure supply chains, and a coordinated deployment pathway, the U.S. risks repeating the same missteps it made in the 5G era: disjointed action, slow rollout, and loss of strategic influence.

What Needs to Be Done

While America’s research base remains the strongest in the world, research alone will not win the quantum race. If the U.S. is to lead, it must learn from its past successes in classical computing—not to repeat them, but to reimagine what success looks like in a radically different technological paradigm. That will require not just continued investment, but high-level coordination, supply chain resilience, and the political will to treat quantum as the strategic priority it already is.

Meeting this moment will involve action on three fronts: modernizing industrial policy, scaling quantum manufacturing infrastructure, and creating the public-private mechanisms needed to translate research into real-world systems. The following recommendations outline how the U.S. can close existing gaps—and move from leadership in theory to leadership in practice.

Figure 7: The quantum computing timeline shows accelerating progress toward practical applications and commercial viability.

Invest in Manufacturing at Scale

Most quantum components are made in limited quantities or overseas. The U.S. doesn’t have dedicated facilities capable of fabricating quantum chips or manufacturing the superconducting systems and photonic hardware needed to scale. Changing this is a priority. The CHIPS Act helped jumpstart domestic production of semiconductors. It should now do the same for quantum.

Congress should fully fund DOE’s proposed Quantum Energy Infrastructure Bill, including its $250 million allocation for a national quantum foundry and benchmarking of energy-efficient quantum systems. We also need pilot production lines, open-access testbeds, and regional hubs like those now emerging in Illinois, Colorado, and New York. The goal isn’t just to build powerful quantum computers—it’s to make sure they’re built in America.

Securing the Supply Chain

Quantum hardware is fragile. It depends on materials and subsystems that are manufactured by a handful of firms. Many of them are abroad. Just one geopolitical shock could bring the entire sector to a halt. Congress should pass the Support for Quantum Supply Chains Act and direct NIST and the QED-C to assess and reinforce domestic sourcing.

This means identifying choke points, stockpiling rare materials, and expanding U.S. production of high-purity isotopes, chips for critical quantum computing infrastructure, dilution refrigerators, and precision lasers. It also means treating this like the national security issue it is.

DoD and DOE should continue to back multi-modality testbeds and centers of excellence. This approach increases our odds of success and ensures we’re not locked into a dead-end architecture.

Modernize the CHIPS Act

As currently written, the CHIPS Act is about semiconductors. Quantum systems often don’t use them—and when they do use chips, they require completely different materials and processes, such as superconductors. Policymakers should expand CHIPS Act eligibility to explicitly cover quantum and classical superconducting chip fabrication and prototyping. That includes support for specialized quantum labs, tooling, and workforce pipelines.

Incentivize Open Standards and Public-Private Collaboration

Quantum breakthroughs rarely happen in isolation. Progress depends on shared tools, interoperable systems, and collaboration across academia, industry, and government. The U.S. should expand programs like the QED-C, strengthen NIST’s role in standards-setting, and ensure public funding supports open-source toolkits and shared benchmarks.

Programs like DARPA’s Quantum Benchmarking Initiative are a good start. But they should be scaled up. Open, transparent evaluation criteria—not marketing claims—will be essential to maintaining public trust and commercial credibility.

Conclusion

In 2001, Ray Kurzweil—who would later become a director of engineering at Google—published a seminal essay titled “The Law of Accelerating Returns.” In it, he notes, “Our ancestors expected the future to be pretty much like their present, which had been pretty much like their past. We expect the changes over the next 100 years to mirror the changes of the last 100 years. But in the 21st century, we might see the equivalent of 20,000 years of progress (at today’s rate).”

The blogger Tim Urban uses a metaphor of human history as a 1,000-page book. From pages 1 to 999 (250,000 years ago to the late 18th century), not much changed. The global population stayed below one billion. Transportation meant walking or riding horses and camels. Energy came from human and animal muscle, windmills, and water wheels. Communication was limited to speech, letters, and smoke signals.

But then comes page 1,000—the last 250 years—and everything changes. We see the rise of steamships, trains, automobiles, planes, and rockets. Energy shifts to fossil fuels, nuclear fission, and renewables. Communication leaps from the telegraph to telephones, emails, video calls, and global networks. In just one page, human civilization accelerates more than in the 999 before it.

If all of this happened on the final page—and if Kurzweil is right that progress is accelerating exponentially—then we are no longer just turning the page of history. We’re entering a new volume altogether. The question is: What kind of story will we write next?

As we’ve seen, quantum computing holds extraordinary promise. It could help us cure diseases, optimize energy systems, transform communication, and unlock solutions to some of humanity’s most pressing challenges. In the right hands, it could enable a future that is healthier, more sustainable, and more equitable.

But this outcome is far from guaranteed. These same technologies, if misused or monopolized, could just as easily accelerate inequality, destabilize global systems, or entrench authoritarian control. The very power that makes quantum computing transformative also makes it dangerous in the absence of wisdom, foresight, and shared values.

And yet, our institutions, leadership models, and systems of governance are still designed for a slower world. We are confronting exponential change with linear tools. As a result, our instincts—once essential for survival—are increasingly mismatched with the scale and speed of what we now face.

We have a choice. We can treat quantum computing as a distant science project—or we can recognize it as the geopolitical and moral frontier that it is. The nations that lead boldly and wisely will not only define the next chapter of global power, but also the values embedded in the systems that shape our collective future.
