Readout: 1st Workshop on Broadly Accessible Quantum Computing
My First Foray into the World of Quantum and HPC

Welcome! This is The Quantum Stack. Created by Travis L. Scholten, it brings clarity to the topic of quantum computing, with an aim to separate hype from reality and help readers stay on top of trends and opportunities. To subscribe, click the button below.
A couple of weeks ago I participated in the 1st Workshop on Broadly Accessible Quantum Computing, organized by the National Center for Supercomputing Applications, NERSC, QuEra Computing, and Transform Computing.
The purpose of this workshop was to bring together the quantum computing and HPC communities, to start to build connections and raise mutual awareness across these two disciplines:
…this workshop will cover how universities, supercomputing centers, national labs, and industry players are leveraging quantum resources for research and approaching the integration of quantum and classical resources in the context of HPC+AI.
Achieving this purpose was well-reflected in the workshop’s agenda, which featured a mix of panels and speakers from both the quantum and HPC communities.
A special shout-out in the agenda to Ananya Tadepalli, a high schooler who shared work she’s been doing on the impact of quantum-safe cryptography on website response times!
Take-Aways
There are 4 key take-aways I left the workshop with:
Speak HPC lingo
Find useful HPC-relevant applications
The need for a quantum-ready HPC community
Beware misunderstandings on energy consumption
Many of them relate to how gaps between the quantum computing and HPC communities need to be bridged.
Take-Away 1: Speak HPC Lingo
The quantum community needs to “speak” HPC, not teach the HPC community quantum mechanics.
When discussing quantum computing, the temptation to go back to the basics always exists. While this is useful for physicists (and, to varying degrees, computer scientists), HPC practitioners are much more focused on what computers can do and are used for. As such, how the HPC community is engaged should focus more on the possible applications of quantum computers.
This is a perennial problem for the industry, not unique to HPC. So any advances made in how quantum computing is discussed or explained would translate over.
Take-Away 2: Find useful HPC-relevant applications
To help with the previous point, the quantum community should get a better handle on the typical kinds of workloads at HPC centers, how the nature of an HPC center impacts the kinds of workloads it runs, and the topics currently in the air in HPC communities.
HPC centers are diverse. For example, one of the host institutions – the National Center for Supercomputing Applications – is a key institution for Illinois and the Midwest region, while another host – NERSC – is one of the US Department of Energy’s three major high-performance computing user facilities. One of the speakers from Arizona State University, Gil Speyer, represented the University’s research computing facilities, which support the work of the faculty, students, and employees at ASU. I say all of this to highlight that different kinds of HPC centers serve different communities, and so the kinds of workloads run at those centers will be different.
The quantum community needs to develop a more sophisticated understanding of what HPC workloads are (what kinds of problems, at what scales, requiring what kinds of compute/algorithms), so that when discussions of how quantum could accelerate HPC arise, a more thoughtful and nuanced answer can be given. There have been at least 2 papers recently which start to help bridge that gap, including one from Los Alamos National Laboratory, Potential Applications of Quantum Computing at Los Alamos National Laboratory, as well as one from the IBM Quantum Technical Working Group on Materials Science, Quantum-centric supercomputing for materials science: A perspective on challenges and future directions.
Hopefully increased contact between these two communities leads to more such papers and perspectives. Achieving that contact is going to require concerted efforts, though. At this time, a broad-based adoption of quantum by the HPC community cannot take place without more (helpful) evangelization from the quantum community, and more hands-on-keyboard experience by the HPC community.
Take-Away 3: The need for a quantum-ready HPC community
This leads to the third take-away; namely, the HPC community needs to get quantum ready, because (a) HPC is going to be needed to support the development of large-scale quantum computers, and (b) the combination of current-state quantum computers and HPC capabilities is already starting to be explored in the support of developing new quantum algorithms. The former is necessary because large-scale, error-corrected quantum computers are going to need some HPC capabilities to support error correction. The latter is exemplified in, for example, the recent work Chemistry Beyond Exact Calculations on a Quantum-Centric Supercomputer which leveraged an IBM Quantum system – along with the Fugaku supercomputer (at the time of writing, #4 on the Top500 list of the most powerful computers in the world) – to develop and validate new quantum algorithms for chemistry.
How the community gets ready depends on how integrated with (and abstracted from) HPC systems quantum computers are. As far as I am aware, no standard orthodoxy or nomenclature exists on this, so I’ll use that depicted in the figure below. To me, this nomenclature reflects how quantum computers go from being conceived of as independent, standalone systems to becoming essentially “invisible” in the context of HPC. Over time, the amount of quantum knowledge required to use quantum + HPC decreases as the level of abstraction increases.

(Pictograms sourced from the IBM Design Language Library.)
Without getting into quibbles about where any one partnership between an HPC provider and a quantum computing company falls on this spectrum, the essential point is that right now we’re much closer to the left-hand side than the right. Consequently, the amount of quantum domain expertise required is quite high. This leads to the tension highlighted in the first take-away: not every HPC user should need a PhD to use quantum computers, but the right starting level of abstraction for making a meaningful dent in education and upskilling is not yet clear. Over time, though, especially through partnerships and collaborations, the requisite abstraction layers will be built, unlocking relevance for a wider class of end-users.
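To make the co-processor idea concrete, here is a minimal, purely illustrative Python sketch of the hybrid pattern many quantum + HPC workflows follow today: a classical outer loop (the HPC side) repeatedly dispatches a parameterized evaluation to a quantum resource and post-processes the result. The `sample_qpu` function is a hypothetical stand-in I made up for illustration – in a real workflow it would be a call to a vendor SDK or a scheduler-managed QPU endpoint.

```python
import math

def sample_qpu(theta: float) -> float:
    """Hypothetical stand-in for a QPU call: returns an estimated
    expectation value <Z> for a one-parameter RY(theta) circuit.
    A real workflow would submit a circuit to hardware here."""
    # Deterministic stand-in: <Z> = cos(theta) for |0> -> RY(theta).
    return math.cos(theta)

def hybrid_minimize(theta: float = 2.0, lr: float = 0.2, steps: int = 50) -> float:
    """Classical (HPC-side) outer loop: gradient descent on the
    'quantum' cost function, using the parameter-shift rule so that
    gradients are estimated from QPU evaluations alone."""
    for _ in range(steps):
        # Parameter-shift rule: d<Z>/dtheta = (f(t + pi/2) - f(t - pi/2)) / 2
        grad = (sample_qpu(theta + math.pi / 2) - sample_qpu(theta - math.pi / 2)) / 2
        theta -= lr * grad
    return theta

final_theta = hybrid_minimize()
print(round(sample_qpu(final_theta), 3))  # converges toward the minimum of -1 at theta = pi
```

The point of the sketch is the division of labor, not the toy cost function: the classical side owns the optimization logic and only ever sees expectation values, which is roughly where the middle of the abstraction spectrum sits today.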
In addition, part of getting quantum ready is that operators of HPC facilities need to really understand what infrastructure and capital requirements would be involved in bringing quantum computing to their facilities. Different qubit modalities come in different form factors, and that needs to be taken into consideration. It will be very interesting to see how the planned integration of an IBM Quantum System Two with the Fugaku supercomputer goes in that regard.
Take-away 4: Beware misunderstandings on energy consumption
Finally, the fourth take-away concerns a developing narrative which impacts HPC, quantum, and AI – that of perceived energy walls/limits and the notion of sustainable computing. This take-away really hit home for me during the panel I did with Nicholas Harrigan of Nvidia. Several questions came up regarding the energy consumption of quantum computing, couched in the context of the dramatic rise in the amount of power used to train and run inference with advanced AI models and the proliferation of data centers. I am concerned that the underlying frame of this question (“The world is using too much energy, and we’re going to run out.”) isn’t helpful for actually talking about the topic of advanced computing and sustainability, as it presumes the answer. (Namely, “reduce energy consumption”.)
In addition, a lot of work has been done on the energy demand side in terms of improving the energy efficiency of quantum computers, a topic I discussed in an article last year:
And, of course, much work continues.
Further, envisioning a future where quantum computers are as ubiquitous as data centers themselves seems quite difficult at this stage in the technology’s trajectory. Large-scale quantum computers are going to require a fair bit of space and infrastructure. Just look at PsiQuantum’s planned installation in Chicago – at 300,000 square feet, the facility will be a tad over 5 football fields in surface area! A likely scenario is that, over the coming decades, only a handful of such systems will ever be built.
What’s more, quantum computers are better suited to problems which are compute-intensive rather than data-intensive. (A point discussed in the panel.) That is, we already know that “quantum big data” problems are almost a contradiction in terms, because the input/output (I/O) and operation rates for quantum computers are much slower than those for classical ones. There’s a nice paper by Microsoft on this, Disentangling Hype from Practicality: On Realistically Achieving Quantum Advantage (see Table 1). This re-emphasizes the point that quantum computers are going to be used for certain kinds of problems, not general-purpose compute, and that large-data problems for AI, chemistry, optimization, etc. are going to be outside quantum computing’s wheelhouse.
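A quick back-of-envelope calculation shows why the I/O gap bites. The rates below are illustrative assumptions of mine (roughly GHz-scale classical operations versus kHz-scale quantum gate layers), not measured figures for any particular system:

```python
# Back-of-envelope: why "quantum big data" is almost a contradiction in terms.
# Rates are illustrative order-of-magnitude assumptions, not measured figures.
classical_ops_per_sec = 1e9  # ~GHz-scale classical operation rate (assumed)
quantum_ops_per_sec = 1e4    # ~10 kHz-scale quantum gate-layer rate (assumed)

n_items = 1e9  # loading a billion data points, one operation each
classical_seconds = n_items / classical_ops_per_sec  # about a second
quantum_seconds = n_items / quantum_ops_per_sec      # about 10^5 seconds

print(classical_seconds, quantum_seconds / 3600)  # seconds vs. hours
```

Under these assumptions, merely streaming the data into a quantum computer takes on the order of a day, before any useful computation happens – which is why compute-intensive, small-data problems are the natural fit.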
Finally, there’s always the possibility of increasing the energy supply side. I for one am looking forward to the nuclear renaissance…assuming that America can stop losing the atomic age.
Wrap Up
Over the past several years, the topics of quantum computing and HPC have started to mix and mingle. While figuring out how these two technologies work together is still in a nascent stage, much opportunity exists for finding clever ways to leverage quantum computers as co-processors for HPC applications (and vice-versa). In the coming years, as more partnerships between HPC centers and quantum computing providers are inked and underway, I expect we’ll see a lot of interesting results showcasing the usefulness of the convergence between quantum and HPC.
Attending this workshop gave me a first-hand perspective on how HPC centers are thinking about quantum computing, and I thank the organizers for inviting me to speak.
Finally, I want to give a shout-out to the National Center for Supercomputing Applications, who did a very nice writeup of the workshop here.
P.S. In addition to The Quantum Stack, you can find me online here.
Was this article useful? Please share it.
Note: All opinions expressed in this post are my own, are not representative or indicative of those of my employer, and in no way are intended to indicate prospective business approaches, strategies, and/or opportunities.
Copyright 2024 Travis L. Scholten