More Innovation Through Different Computation: Quantum Computing and the New Substrate of Science
Because tomorrow's challenges can't be solved with yesterday's computers
Welcome! This is The Quantum Stack. Created by Travis L. Scholten, it brings clarity to the topic of quantum computing, with an aim to separate hype from reality and help readers stay on top of trends and opportunities. To subscribe, click the button below.
AI transparency note: This article was developed through a collaborative, “human-in-the-loop” process. Rather than simply prompting a model (Gemini) to “write a post,” I provided the core themes, personal anecdotes, and specific technical arguments from the recent guest lecture discussed below. I then worked interactively with Gemini to refine the narrative structure and word choice, and performed a final editorial review and round of edits. While Gemini assisted in synthesizing these thoughts, the creative intent, philosophical framework, and strategic conclusions are entirely my own. I welcome your thoughts on this process, as well as the result.
Executive Summary: The Compute Substrate
The Core Thesis: Traditional computational architectures are reaching their functional limits, necessitating a shift toward a new “substrate” of science to solve humanity’s most complex challenges.
The Discovery Flywheel: Innovation is driven by a self-reinforcing cycle where data begets compute, which enables new applications, which in turn generate more data.
The Genesis Mission: A national “down payment” on this substrate, the DOE’s Genesis Mission aims to double American scientific productivity by 2036 through a unified data, compute, and AI platform.
Quantum-Centric Supercomputing (QCSC): The future of HPC is true “heterogeneity,” shifting toward the integration of CPUs, GPUs, QPUs, and other special-purpose chips, all managed and orchestrated in new ways.
National Strategy: Compute has become the next great American infrastructure—on par with the Interstate Highway System. In the modern era, to out-compute is to out-compete.
Human Flourishing: This transition must be architected with intentionality, ensuring that new capabilities serve the common good and promote true human dignity.
Last week, I visited Fairfield University to guest-lecture a physics elective designed specifically for non-science majors. Teaching quantum computing to a room full of students who aren’t necessarily pursuing STEM degrees is a powerful exercise in clarity. It strips away the comfort of technical jargon and forces a speaker to answer one fundamental question: “So what?”
The goal of our session was to explore how emerging forms of computation are transforming our collective ability to understand the world and innovate within it. As scientific and societal challenges grow more complex, we are finding that traditional computers are reaching their functional limits. We are now entering an era where evolving computational architectures are reshaping the nature of discovery itself.
One of the most rewarding “lightbulb moments” during the session happened during a discussion on careers. Many students assumed that working in quantum computing required a PhD in Physics. I was able to share that the field desperately needs expertise outside of STEM—in policy, business, and strategy. Several students left the room realizing that they could have a seat at the table in the quantum era without being the ones writing the code or building the hardware.
Below are the major themes we explored.
1. The Discovery Flywheel and the Genesis Mission
Innovation is not a linear progression; it is a spiral. Data begets the need for compute; new compute enables new applications; and those applications, in turn, generate even more data. Each turn of this “discovery flywheel” raises the stakes (and the impact) of the next turn.
A prime example of a “down payment” on realizing the next turn of the flywheel is the DOE’s Genesis Mission. Genesis perfectly encapsulates this “data-compute-applications” paradigm. Its primary goal is to unearth decades of “locked” data within the U.S. National Labs and pair it with the American Science and Security Platform (the compute) to solve various “grand challenges” (the applications). By doing so, the Genesis Mission aims to trigger a new scientific revolution, doubling American scientific productivity within a decade.
2. From Soloists to Substrate
For decades, High-Performance Computing (HPC) relied on the CPU as a solo performer. The rise of GPUs began shifting the focus toward bandwidth, topologies, and data flows across a heterogeneous compute system. Today, we are witnessing a “Cambrian explosion” of unconventional computing hardware—such as quantum computers, specialized AI accelerators (e.g., Cerebras and Groq), and other kinds of physics-informed computational chips. “Heterogeneous compute” is becoming more diverse, necessitating a change in how we architect advanced computing systems.
An architecture we are moving toward is Quantum-Centric Supercomputing (QCSC). This is not about one technology replacing another, but about the composition of capabilities. As we integrate CPUs, GPUs, and QPUs, we face the challenge of “hardware-aware” workflow management. There is significant potential for AI to serve a key role here: deciding how workloads are executed across heterogeneous hardware while accounting for the physical realities of latency and bandwidth, and the computational realities of each device’s specific and unique capabilities. A toy sketch of such a dispatcher appears below.
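To make the orchestration challenge concrete, here is a minimal sketch in Python (every name here is hypothetical, invented purely for illustration; this is not any real scheduler’s API) of a dispatcher that routes each stage of a hybrid workflow to a compatible device while weighing queue latency against data-transfer time:

```python
from dataclasses import dataclass

@dataclass
class Device:
    """A compute resource in a heterogeneous (quantum-centric) system."""
    name: str
    kind: str                # "CPU", "GPU", or "QPU"
    queue_latency_s: float   # estimated wait before work can start
    bandwidth_gbps: float    # link bandwidth for moving inputs/outputs

@dataclass
class Task:
    """A unit of work with a required hardware kind and data footprint."""
    label: str
    kind: str        # which primitive it needs: "CPU", "GPU", or "QPU"
    data_gb: float   # input data that must move to the device

def dispatch(task: Task, devices: list[Device]) -> Device:
    """Pick the compatible device minimizing (queue latency + transfer time)."""
    candidates = [d for d in devices if d.kind == task.kind]
    if not candidates:
        raise ValueError(f"no {task.kind} available for task {task.label!r}")
    # data_gb * 8 converts gigabytes to gigabits; dividing by Gbps gives seconds
    return min(candidates,
               key=lambda d: d.queue_latency_s + task.data_gb * 8 / d.bandwidth_gbps)

# A toy workflow: classical pre-processing, quantum circuit execution,
# then GPU-based post-processing of the measurement outcomes.
devices = [
    Device("cpu-0", "CPU", queue_latency_s=0.1, bandwidth_gbps=50.0),
    Device("gpu-0", "GPU", queue_latency_s=2.0, bandwidth_gbps=200.0),
    Device("qpu-0", "QPU", queue_latency_s=30.0, bandwidth_gbps=1.0),
]
workflow = [
    Task("preprocess", "CPU", data_gb=10.0),
    Task("run-circuits", "QPU", data_gb=0.01),
    Task("postprocess", "GPU", data_gb=5.0),
]
for task in workflow:
    print(f"{task.label} -> {dispatch(task, devices).name}")
```

Real orchestrators must also contend with failures, QPU calibration drift, co-scheduling, and much more; the point is simply that hardware awareness turns scheduling into an optimization over physical constraints.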
(As an aside, looking back at this article from about 18 months ago, it’s crazy to consider just how much has happened at the intersection of quantum and HPC!)
3. Building the National Foundation
History shows that those who build the substrate create the future.
The Homestead Act and Rural Electrification Act provided the physical and energy foundations for growth.
The Interstate Highway Act unified a fragmented transportation network into a cohesive whole.
ARPANET/NSFNET laid the digital tracks we still travel today.
My hypothesis is that compute is our next great American substrate. Just as the highways facilitated the movement of assets, a unified computational substrate will facilitate the movement and processing of ideas. “Discovery at the speed of thought,” and all that. (Though it turns out that thought is quite slow!) By expanding what can be simulated and designed, we change the trajectory of society.
4. Out-Computing is Out-Competing
The ability of a nation to build and maintain an HPC ecosystem is an indicator of its innovation capacity. We saw this play out over the last 20 years in the global race toward exascale systems.
To lead in this race, we must treat diverse hardware such as GPUs, QPUs, TPUs, and AIUs as different “computational primitives.” By using math as the universal language to describe the problems these primitives can solve, we can apply formal methods to design truly rigorous, verifiable, and trustable workflows at scale (a toy version of this idea is sketched below).
This allows us to “model before we manifest.” By simulating reality before we build it, we reduce the cost of failure and increase the impact of our results. In this era, the future trajectory of society increasingly depends on the future of computation. To out-compute is to out-compete.¹
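As a minimal sketch of what “formal methods” can mean at the workflow level (again, hypothetical interfaces, not a real toolchain), consider treating each primitive as a typed function over mathematical objects, and checking that a composed workflow is well-typed before any expensive hardware runs:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Stage:
    """One step in a workflow: a primitive with declared input/output types."""
    name: str
    primitive: str   # e.g. "GPU:dense-linear-algebra", "QPU:sampling"
    in_type: str     # mathematical object consumed
    out_type: str    # mathematical object produced

def check_workflow(stages: list[Stage]) -> None:
    """Verify that each stage's output type matches the next stage's input.

    This is the simplest possible "formal method": a type check over the
    workflow, performed before any expensive hardware is touched.
    """
    for upstream, downstream in zip(stages, stages[1:]):
        if upstream.out_type != downstream.in_type:
            raise TypeError(
                f"{upstream.name} produces {upstream.out_type!r} but "
                f"{downstream.name} expects {downstream.in_type!r}")

# A hybrid workflow: classical matrix prep -> quantum sampling -> GPU estimation.
workflow = [
    Stage("build-hamiltonian", "CPU:sparse-linear-algebra",
          in_type="molecule-spec", out_type="hamiltonian"),
    Stage("sample-circuits", "QPU:sampling",
          in_type="hamiltonian", out_type="bitstring-counts"),
    Stage("estimate-observables", "GPU:dense-linear-algebra",
          in_type="bitstring-counts", out_type="expectation-values"),
]
check_workflow(workflow)   # raises TypeError if the composition is ill-formed
print("workflow is well-typed")
```

Production-grade verification would go much further (resource bounds, error budgets, semantic correctness proofs), but even this kind of lightweight check is what lets heterogeneous workflows be trusted at scale.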
Bonus: Navigating the Skepticism
During the Q&A, a student asked whether we should believe the quantum skeptics. My answer is that we must distinguish between “good-faith” and “silly” skepticism.
Arguments about whether undiscovered mesoscopic physics might emerge to disrupt quantum states as we scale are intellectually rigorous and necessary. We know the world transitions from quantum to classical at macroscopic scales; understanding that boundary is vital as we push toward Fault-Tolerant Quantum Computing (FTQC). As I’ve noted previously, recent roadmaps suggest this milestone is increasingly tethered to engineering reality rather than just theory.
Conversely, arguments that suggest we simply “cannot control” high-dimensional quantum states often ignore the actual realities of quantum information theory, quantum error correction, etc.
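To give a flavor of what error correction buys us, here is a standard back-of-the-envelope calculation for the simplest case: a distance-3 repetition code protecting against bit flips. Majority voting over three noisy copies fails only when two or more of them flip, so for a physical error rate p below 1/2 the logical error rate is suppressed quadratically.

```python
# Logical error rate of a distance-3 repetition code under independent
# bit-flip noise with probability p: majority vote fails iff >= 2 bits flip.
def logical_error_rate(p: float) -> float:
    return 3 * p**2 * (1 - p) + p**3

for p in [0.1, 0.01, 0.001]:
    print(f"physical p = {p:>6}: logical p = {logical_error_rate(p):.2e}")

# physical p =    0.1: logical p = 2.80e-02   (already better than 0.1)
# physical p =   0.01: logical p = 2.98e-04
# physical p =  0.001: logical p = 3.00e-06   (quadratic suppression)
```

Fault-tolerant schemes use far richer codes (e.g., surface codes) that also handle phase errors, but the scaling intuition carries over: below a threshold error rate, adding redundancy drives the logical error rate down exponentially. That is precisely the reality the “silly” skepticism ignores.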
The Architecture of Flourishing: Beyond the Bit
Reflecting on the last three decades, it is clear that the rise of the internet didn’t just change communication; it fundamentally reshaped the human experience. We have seen how new computational capabilities do not just sit on top of society—they weave into its very fabric.
Because the future of our society increasingly depends on the future of computation, we cannot afford to let this new computational substrate be built in a vacuum. To ensure this transition serves us well, there must be a deep and sustained engagement between the “builders”—the scientists and engineers—and the “stewards”—the policymakers, entrepreneurs, and managers.
As we continue to build this new substrate of science, our goal must be to move beyond mere efficiency. We must intentionally architect these systems in ways that promote true human flourishing and dignity and serve the common good. Only by bridging the gap between technical innovation and philosophical intent can we ensure that as we out-compute our challenges, we do not lose sight of the humanity we are computing for.
Was this article useful? Please share it.
P.S. In addition to The Quantum Stack, you can find me online here.
NOTE: All opinions and views expressed are my own, are not representative or indicative of those of my employer, and in no way are intended to indicate prospective business approaches, strategies, and/or opportunities.
Copyright 2026 Travis L. Scholten. All rights reserved (to the degree to which they can be).
¹ I found the phrase “to out-compute is to out-compete” in this 2022 report from Hyperion Research, which itself references this concept from the US Council on Competitiveness.



