Re-Assessing the Benefits and Risks of Quantum Computers
Timelines are accelerating.
Welcome! This is The Quantum Stack. Created by Travis L. Scholten, it brings clarity to the topic of quantum computing, with an aim to separate hype from reality and help readers stay on top of trends and opportunities. To subscribe, click the button below.
Update 2025/09/25: Included link to Quantum World Congress panel details.
Note: Substack indicates this article is too long for email, so you may want to read it on the app or online.
In January of last year, a few co-authors and I published the paper Assessing the Benefits and Risks of Quantum Computers. This multi-institution paper sought to bridge the gaps between technical results in quantum computing, policymakers looking to get a handle on the economic and national security implications of this technology, and the cybersecurity professionals grappling with the need to prepare for a post-quantum future. In the paper, we take “economically-impactful quantum computing” to mean quantum computers whose capabilities enable advances in commerce and private industry, and “cryptographically-relevant quantum computing” to mean those whose capabilities enable the breaking of currently-used cryptography.
As a quick review, the paper identified four key levers that were accelerating the development of economically-impactful quantum computing:
the creation of new kinds of quantum algorithms,
the rise of error mitigation techniques which enable noisy quantum computers to emulate less-noisy ones,
the development of problem partitioning methods that allowed quantum computers to tackle more complex problems, and
the exploration of commercially-relevant use cases by the private sector.
The paper argued that these levers did not also accelerate the realization of cryptographically-relevant quantum computation, and that realizing such computations would require a quantum computer with capabilities exceeding those being built and envisioned for the near future. Furthermore, the paper discussed how the development of quantum safe cryptographic standards by NIST was laying the groundwork for a post-quantum future.
Given these levers and trends, the paper concluded that quantum computers are credibly expected to be capable of performing economically-impactful computations before they are capable of performing cryptographically-relevant ones.
Need a refresher on the paper? See this article:
Now, given the paper was published nearly 2 years ago, how is this conclusion holding up nowadays?
This very question was the focus of a panel I shared with some of my co-authors at the Quantum World Congress in DC this week.
You can view the panel here:
Spoiler alert: I think the conclusion as written is fine, and can remain unchanged. However, advances in quantum algorithms are accelerating timelines for both kinds of quantum computations. Let’s take a look at why, through the lens of the major points the paper raised.
Revisiting The Four Key Levers
For each key lever described above, we’ll consider what’s been happening in the research literature. Rather than judging whether the paper got the impact of the lever “right” or “wrong”, I’ll be giving the paper’s perspective on that lever one of three ratings: “Directionally Accurate”, “Merits Attention”, or “Needs Review”.
New Quantum Algorithms
The paper originally focused solely on variational algorithms, a particular kind of quantum algorithm leveraging classical computation. These days, it’s clear that all quantum algorithms are “quantum + classical” – you’ll always need a classical computer in the loop to run a quantum computer! This is true even of “pure” quantum algorithms such as phase estimation, Shor’s, and so on.
So the question then becomes “What parts of the problem do I need to run on quantum hardware, versus a classical processor/accelerator?”. Recently, new kinds of algorithms have been developed wherein classical high-performance computing (HPC) is leveraged to process results obtained by a quantum computer. In retrospect, this makes a lot of sense, because HPC is a very mature compute capability. For example, the new “sample-based quantum diagonalization” algorithms IBM has been researching are likely to proliferate in the coming years, and they highlight the crucial role HPC has to play in quantum computing.
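To make the quantum + HPC division of labor concrete, here is a minimal sketch in the spirit of sample-based diagonalization: bitstrings “sampled” from a quantum computer select a small subspace of the full problem, and a classical computer diagonalizes the projected problem. The random matrix and randomly-drawn samples below are illustrative stand-ins, not a faithful SQD implementation:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy 5-qubit "Hamiltonian" as a dense symmetric matrix (32 x 32).
# In a real workflow this would be a sparse molecular Hamiltonian.
n = 32
A = rng.normal(size=(n, n))
H = (A + A.T) / 2

# Pretend these basis-state indices were sampled from a quantum computer;
# here we simply draw a random subset of computational basis states.
sampled_states = rng.choice(n, size=8, replace=False)

# Project H onto the subspace spanned by the sampled basis states, then
# diagonalize the (much smaller) matrix classically -- the "HPC" step.
H_sub = H[np.ix_(sampled_states, sampled_states)]
subspace_energy = np.linalg.eigvalsh(H_sub)[0]

# By eigenvalue interlacing, the subspace estimate upper-bounds the true
# ground energy; better samples tighten the bound.
true_energy = np.linalg.eigvalsh(H)[0]
print(f"subspace estimate: {subspace_energy:.3f}, exact: {true_energy:.3f}")
```

The design point is that the quantum device only has to produce good samples; all the linear algebra happens on classical hardware.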
Overall, this lever was “Directionally Accurate”. In retrospect, we could have expanded its focus to consider more generally the question of “what is a quantum algorithm?”, why quantum algorithms always involve quantum + classical compute, etc. – we didn’t necessarily need to focus solely on variational algorithms.
Error Mitigation Techniques
Error mitigation refers to techniques which allow a noisy quantum computer to emulate a less-noisy one. These techniques do so by defining a set of circuits to run on the quantum computer, along with a rule for how to classically post-process the results from that set to yield an estimate of an ideal quantum computer’s output. In this sense, error mitigation techniques ‘virtualize’ the effect of noise.
The “catch” of error mitigation is that the number of circuits required typically grows exponentially with the number of qubits the circuit acts on. Further, the worse the noise, the faster that exponential growth. For this reason, when error mitigation techniques first arrived on the scene, there was some skepticism about their long-term applicability.
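One widely-used mitigation technique, zero-noise extrapolation, illustrates the “run a set of circuits, then classically post-process” pattern. The sketch below is a toy model: the exponential-damping noise model and the stretch factors are illustrative assumptions, not measurements from real hardware:

```python
import numpy as np

# Zero-noise extrapolation (ZNE): run the same circuit at several
# amplified noise levels, then classically extrapolate the measured
# expectation value back to the zero-noise limit.

def noisy_expectation(noise_scale, ideal=1.0, error_rate=0.02, depth=20):
    # Toy noise model: each circuit layer damps the signal by
    # (1 - scale * rate); a stand-in for a real device measurement.
    return ideal * (1 - noise_scale * error_rate) ** depth

# "Measure" at noise stretch factors 1x, 2x, 3x (e.g., via gate folding).
scales = np.array([1.0, 2.0, 3.0])
values = np.array([noisy_expectation(s) for s in scales])

# Richardson-style extrapolation: fit a quadratic through the three
# points and evaluate it at noise scale 0.
coeffs = np.polyfit(scales, values, deg=2)
mitigated = np.polyval(coeffs, 0.0)

print(f"raw (1x noise): {values[0]:.4f}, mitigated: {mitigated:.4f}")
# The mitigated estimate lands much closer to the ideal value of 1.0
# than the raw measurement does.
```

Note the trade-off the article describes: the mitigated estimate is purely a product of classical post-processing over extra circuit runs, and the number of runs needed to control its statistical error is what blows up as noise worsens.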
These days, error mitigation is back in vogue as a technique that can be layered on top of quantum error correction (QEC). Recall that QEC operates on a set of noisy qubits to distill from them a smaller number of less-noisy qubits. Thus, QEC decreases – but does not make zero! – the noise in a quantum computer.
This is an important point, because people sometimes overlook the fact that fault-tolerant quantum computers (FTQCs; e.g., quantum computers using QEC) still have non-zero noise, meaning there will still be a limit on the scale and complexity of the circuits they can run. While those limits are much, much larger than those of the ‘bare’ quantum computer, they still exist. Hence, you can use error mitigation on an FTQC to emulate an even-less-noisy FTQC, and thereby expand those limits.
The really neat thing about this is efficiency. Because its overhead depends on the noise rates, error mitigation requires much less effort on an FTQC than on a ‘bare’ quantum computer. This will make using error mitigation much more practical, because fewer circuits will need to be run.
Given all this, the perspective taken in the paper was “Directionally Accurate”, and could even be strengthened in light of recent literature.
Circuit Knitting Methods
Circuit knitting refers to methods that decompose a problem at various levels. For example, an embedding method might break down the problem at the level of the problem itself, identifying the components that make sense to solve using a quantum computer. (These methods are very natural in quantum chemistry calculations, for example.) Or, a circuit cutting method can be used to break apart a many-qubit circuit into a collection of smaller circuits.
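As a toy illustration of the decomposition idea, the sketch below greedily partitions a problem’s interaction graph into device-sized fragments and counts the interactions that get “cut”. Real circuit-cutting methods then pay a classical post-processing cost that grows exponentially with the number of cuts; the greedy partitioner here is purely illustrative:

```python
# Toy illustration of the partitioning idea behind circuit cutting: split
# a problem's interaction graph into fragments small enough for a device,
# and count how many interactions get cut in the process.

def partition(edges, num_nodes, max_fragment):
    # Greedy assignment: fill each fragment with consecutive nodes.
    fragments, current = [], []
    for node in range(num_nodes):
        current.append(node)
        if len(current) == max_fragment:
            fragments.append(current)
            current = []
    if current:
        fragments.append(current)
    # An edge whose endpoints land in different fragments is a "cut".
    label = {n: i for i, frag in enumerate(fragments) for n in frag}
    cut_edges = [e for e in edges if label[e[0]] != label[e[1]]]
    return fragments, cut_edges

# A 6-qubit ring of interactions, targeting fragments of at most 3 qubits.
ring = [(i, (i + 1) % 6) for i in range(6)]
frags, cuts = partition(ring, num_nodes=6, max_fragment=3)
print(frags)  # [[0, 1, 2], [3, 4, 5]]
print(cuts)   # [(2, 3), (5, 0)] -- two cuts; the classical reconstruction
              # cost grows exponentially with the number of cuts
```

Each fragment runs as a smaller circuit, and the cut interactions are reconstructed classically — exactly the quantum-vs.-classical trade-off described above.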
At the time of the paper, developing and using these methods was a highly active area of study. Nowadays? Not so much. Though given that circuit knitting methods trade off quantum vs. classical compute, one could argue that the quantum + HPC algorithms previously mentioned are a kind of circuit knitting at the algorithm level.
And more generally, whether the community has “moved on” from this topic – or the topic simply needs additional time to mature and find useful applications – isn’t so clear.
For these reasons, this lever “Merits Attention”.
Commercial Use Case Exploration
Without demand pull on the part of end-users, realizing meaningful economic impacts from quantum would be tremendously difficult. Over the past few years, companies have continued to build out their internal capabilities and to engage the broader quantum industry in finding interesting ways of solving business-relevant problems. For example, the 2024 Airbus-BMW quantum challenge saw over 100 submissions from teams looking to address relevant computational problems using quantum. That Airbus and BMW decided to host this challenge – and the strength of the response – seems a testament to interest in creating business-relevant, quantum-enabled solutions.
Further, vendors themselves have been developing programs for enterprises to get quantum ready: the IBM Quantum Accelerator program, AWS’s Quantum Embark program, and Microsoft’s Quantum Ready program – as well as similar professional services from Pasqal, Quera, IonQ, etc. – all seek to educate and upskill employees.
Finally, as a direct measure of corporate interest, this figure from MIT’s 2025 Quantum Index Report clearly shows how interest continues to grow:

As such, this lever is “Directionally Accurate”.
The Development of Quantum Safe Cryptographic Protocols
Quantum safe cryptographic protocols are classical methods for encrypting and decrypting data in a way that is believed to be robust against attacks by quantum computers. These protocols are based on different kinds of math than today’s cryptographic protocols, and have undergone a rigorous evaluation by the National Institute of Standards and Technology (NIST) as to their ability to resist quantum-enabled attacks, as well as their practical appropriateness.
At the time of the paper, NIST was in the process of developing the first set of quantum-safe standards – that is, specific protocols and their parameters which NIST felt would provide adequate security – and those standards were expected to be released soon. The standards for 3 such protocols were released roughly 7 months after the paper went live. In addition, NIST has developed standards for 2 more protocols, one of which recently went live.
Along with these efforts, NIST had been soliciting input for its planned guidance as to when our currently-used cryptographic protocols should be disallowed, as using them will pose a cybersecurity risk. Of particular note is that by 2035 – just 1 decade from now! – NIST is indicating that the use of these protocols should be disallowed.
Given all of these advances, it’s clear the paper is out of date, and our discussion around quantum safe cryptography “Needs Review” to reflect the latest developments.
The Specter of Cryptographically-Relevant Quantum Computing
As noted at the start, the paper presented the case that cryptographically-relevant quantum computing would require a quantum computer whose capabilities exceeded those being built at the time, as well as those envisioned for the near future. Central to that case was the observation that even state-of-the-art results in quantum algorithms for cryptanalysis required prohibitive resources, as well as the hypothesis that a new, recently-introduced algorithm for factoring – Regev’s algorithm – wouldn’t materially change that observation.
Over the past few years, there have been a few advances which do seem to pull the realization of cryptographically-relevant quantum computing forward in time. However, I’d say these advances do not change the central conclusion of the paper. The relevant advances are:
A new method for realizing “difficult” quantum gates. It turns out that, in most approaches to realizing fault-tolerant quantum computing, the largest bottleneck is generating a particular kind of quantum state. Called “magic states”, these quantum states can be used to realize a particular kind of “difficult” quantum gate: the T-gate. While the specific function of this gate isn’t relevant for this article, what is relevant is that the best-known methods for producing high-quality (i.e., low-noise) magic states, dubbed “magic state distillation protocols”, require a lot of overhead. However, a new method, “magic state cultivation”, has been developed which has substantially reduced overhead. Of course, this method will also simplify quantum algorithms for non-cryptographic purposes.
Combining improvements for Shor’s algorithm. A recent paper has leveraged several independent advancements in quantum algorithms, plus advancements in error correction and magic state cultivation, to bring down the physical resource estimates for cryptographically-relevant quantum computation using Shor’s algorithm. This work – How to factor 2048 bit RSA integers with less than a million noisy qubits – represents an order-of-magnitude improvement in those estimates. However, the paper’s author does conclude:
Without changing the physical assumptions made by this paper, I see no way to reduce the qubit count by another order of magnitude. I cannot plausibly claim that a 2048 bit RSA integer could be factored with a hundred thousand noisy qubits.
And as far as Regev’s algorithm is concerned, when the paper was published much was still unknown about it. In particular, whether it would represent a practical improvement to Shor’s algorithm was very much still in question. Over the past few years, the analysis of Regev’s algorithm has continued. While the correctness of the algorithm has been proven, and its applicability has been extended to a wider range of cryptographic protocols, the current consensus in the literature is that “Shor still reigns supreme.” Of course, Shor’s algorithm benefits from the fact that it is much older, so there’s been more time for researchers to develop tricks to simplify it (see above). In time, new, Regev-specific tricks may be developed.
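Returning to the magic-state overhead mentioned above: a standard back-of-envelope for the 15-to-1 distillation protocol shows why distillation is costly, and hence why cultivation’s reduced overhead matters for resource estimates. The 35·p³ error-suppression factor is the textbook figure for this protocol; the raw error rate below is an illustrative assumption:

```python
# Back-of-envelope for why magic state distillation is expensive: the
# standard 15-to-1 protocol consumes 15 noisy T states to produce one
# better T state, suppressing the error to roughly 35 * p^3 per round.

def distill(p_in, rounds):
    p, consumed = p_in, 1
    for _ in range(rounds):
        p = 35 * p ** 3   # error suppression per distillation round
        consumed *= 15    # input states consumed per output state
    return p, consumed

p0 = 1e-3  # assumed raw T-state error rate (illustrative)
for rounds in (1, 2):
    p_out, cost = distill(p0, rounds)
    print(f"{rounds} round(s): error ~ {p_out:.1e}, input states: {cost}")
# 1 round(s): error ~ 3.5e-08, input states: 15
# 2 round(s): error ~ 1.5e-21, input states: 225
```

The error falls off triply-exponentially with rounds, but the state count (and hence qubit and time overhead) grows by 15x per round — the overhead that cultivation-style methods aim to sidestep.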
Overall, this topic feels “Merits Attention”, if only to make sure that the core argument still remains valid.
Quantum Computing And Cryptocurrency
This topic didn’t receive much attention in the paper (just a footnote!) but some focus by each community on the other is warranted.
This is particularly true here in the US, where both President Trump’s early executive order on digital financial technology and the passage of the GENIUS Act this summer have elevated the prominence of cryptocurrency in the national discourse around the future of finance and fintech.
Courtesy of the recently-established Project 11 organization, I recently learned that most stablecoins utilize the same underlying (quantum-vulnerable) cryptographic protocols, and that there are a variety of quantum-enabled attacks against Bitcoin. You can check out Project 11’s “Bitcoin Risq List” to see just how much BTC is in quantum-vulnerable wallets, and the current value of that BTC in USD.
Assuming the future of finance involves cryptocurrency, getting a handle on how to make these currencies quantum-resistant would be of paramount importance. For this reason, I think the topic of quantum and cryptocurrency could use a bit more than cursory attention, and insofar as the paper discussed this topic (not much), it “Needs Review”.
Wrap-Up: Reassessing the Benefits and Risks of Quantum Computers
The main message from this re-assessment is clear: timelines for economically-impactful and cryptographically-relevant quantum computations are accelerating, and moving closer to the present. New algorithms and increased end-user sophistication are hastening the realization of economically-impactful quantum computation. Simultaneously, the urgency of getting quantum safe — coupled with what might be a difficult transition for doing so — means the specter of cryptographically-relevant quantum computing cannot be viewed as just a “distant threat”.
With respect to the paper itself, overall I’d say we hit the mark well:
We identified the right major trends (for the most part), and were clear about the “known unknowns” regarding the development of quantum safe cryptography standards and the feasibility of Regev’s algorithm as a practical competitor to Shor’s.
Our argument about why cryptographically-relevant quantum computing requires large-scale fault-tolerant quantum computers remains salient today, though we’ll likely see continued improvements at the algorithm level. I doubt the algorithms could be simplified to the point where you don’t need such computers.
New quantum + HPC algorithms, coupled with an increasing sophistication of end-users, bodes well for realizing economically-impactful quantum computing.
The topic of cryptocurrency could have been given a bit more prominence, though at the time crypto’s practical potential for economic impact was much more muted than it is now, in my opinion.
What does all this mean for end-users? A few things:
New kinds of quantum algorithms, leveraging best-in-class computational capabilities of a given piece of hardware, are going to accelerate the development of useful applications.
Error mitigation is here to stay for the foreseeable future, and will offer a capability for getting “more juice” out of current and near-future hardware.
There is a robust ecosystem of partners for end-users to collaborate with on developing pilot projects.
Getting quantum safe is of increasing importance, and organizations need to be launching projects to do so.
Clearly, staying informed about the progress in quantum computing is more crucial than ever as we move toward an increasingly quantum-enabled future!
I’m curious what you think — what are the biggest emerging risks or benefits of quantum computing? Let me know in the comments.
P.S. In addition to The Quantum Stack, you can find me online here.
Was this article useful? Please share it.
Note: All opinions expressed in this post are my own, are not representative or indicative of those of my employer, and in no way are intended to indicate prospective business approaches, strategies, and/or opportunities.
Copyright 2025 Travis L. Scholten. All rights reserved.