At long last, I’m happy to share with you a paper I’ve been working on for a while now. Consider it my “labor of love” for the quantum computing community. While The Quantum Stack is typically focused on big-picture trends in the research literature, this paper is worth highlighting given its scope, content, and conclusions. Coming in at 38 pages and 658 references, it’s a whopper!
So what is this paper? It’s called Assessing the Benefits and Risks of Quantum Computers, and it takes a wide look at the literature to evaluate the relative timescales across which quantum computers may provide economic benefits or pose a risk to cryptography. This topic is especially salient these days as several companies have published forward-looking roadmaps indicating they plan to prototype small-scale, fault-tolerant quantum computers over the next ~5 years. (These roadmaps include ones published by IBM Quantum and QuEra Computing.) Given these roadmaps, it becomes increasingly important for policy makers and security experts to have a handle on the anticipated trajectory along which quantum computers may develop.
Hence, this paper. The abstract provides a fairly succinct summary of the paper’s main points and conclusions:
Quantum computing is an emerging technology with potentially far-reaching implications for national prosperity and security. Understanding the timeframes over which economic benefits and national security risks may manifest themselves is vital for ensuring the prudent development of this technology. To inform security experts and policy decision makers on this matter, we review what is currently known on the potential uses and risks of quantum computers, leveraging current research literature.
The maturity of currently-available quantum computers is not yet at a level such that they can be used in production for large-scale, industrially-relevant problems, nor are they believed to currently pose security risks. We identify two large-scale trends -- new approximate methods (variational algorithms, error mitigation, and circuit knitting) and the commercial exploration of business-relevant quantum applications -- which, together, may enable useful and practical quantum computing in the near future.
Crucially, these methods do not appear likely to change the required resources for cryptanalysis on currently-used cryptosystems. From an analysis we perform of currently known algorithms for cryptanalysis, we find they require circuits of a size exceeding those that can be run by current and near-future quantum computers (and which will require error correction), though we acknowledge improvements in quantum algorithms for these problems are taking place in the literature. In addition, the risk to cybersecurity can be well-managed by the migration to new, quantum-safe cryptographic protocols, which we survey and discuss.
Given the above, we conclude there is a credible expectation that quantum computers will be capable of performing computations which are economically-impactful before they will be capable of performing ones which are cryptographically-relevant.
The paper includes a motivation and summary section, which could itself have served as this article. So I would encourage you to go read that. Given the strong claim in the abstract (“there is a credible expectation that quantum computers will be capable of performing computations which are economically-impactful before they will be capable of performing ones which are cryptographically-relevant”), I thought unpacking it a bit might be useful.
We spent quite a bit of time discussing how to articulate this central conclusion of the paper. One of the challenges we ran into was that, as discussed in the paper, the cybersecurity risk associated with quantum computing is essentially already manifesting itself today in the form of “harvest now, decrypt later” attacks. That is, an adversary could already be hoovering up encrypted data and storing it until such a time as the encryption could be broken by a sufficiently-capable quantum computer. For a great discussion on this, see this article over on the Quantum Untangled substack.
So how does one talk about the timescales across which quantum computers provide benefits and risks? We concluded the right way to do so is by discussing the kinds of computations a quantum computer could perform. Hence the central claim of the paper relates to “economically-impactful” versus “cryptographically-relevant” computation. Under that framing, one could argue cryptographically-relevant computations are much farther out in “quasi-absolute” terms – given current understanding of the resources required and industry’s forward-looking roadmaps, at least a decade seems a plausible timeframe.
But what of “economically-impactful” computations? By when might those be realized? Sooner than cryptographically-relevant ones? Later? About the same time? It turns out that resource analyses of useful quantum computations suggest the answer is ‘about the same time’. That is, when researchers analyze the complexity of the quantum circuits needed to run non-cryptographic applications, that complexity is within one to two orders of magnitude of the complexity needed for cryptographic ones. While factors of 10 or 100 would usually signal a big difference, as Figure 2 of the paper shows, you can find particular instances of non-cryptographic and cryptographic applications which require roughly the same resources. Thus, at first glance, it seems “economically-impactful” and “cryptographically-relevant” quantum computation would manifest at roughly the same time.
We undertook a pretty large survey of the literature in Section 3 to identify research trends or activities which might hasten the advent of economically-relevant quantum computations. From this survey, several themes emerged:
Commercial quantum computing. The private sector has been actively engaged in exploring the potential commercial uses of quantum computers. Implication: the end-user base is maturing, and more and more potentially-useful and economically-impactful use cases are being discovered.
Variational algorithms. An entirely new category of algorithm (namely, variational algorithms) has been developed and explored. Successfully running these algorithms does not require fault-tolerance, and many different uses of these algorithms across a wide variety of problems are being developed. These algorithms can use shallow-depth circuits, which have been shown to be useful in certain circumstances. Implication: we should be able to find interesting shallow-depth quantum circuits to run.
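To make the variational pattern concrete, here is a toy sketch (my own illustration, not from the paper): a classical optimizer tunes the parameter of a shallow circuit to minimize a measured expectation value. For the one-qubit ansatz Ry(θ)|0⟩ the expectation ⟨Z⟩ is exactly cos(θ), so the sketch uses that closed form in place of real hardware shots.

```python
import math

# Toy variational loop: minimize <Z> over the one-qubit ansatz Ry(theta)|0>.
# For this ansatz, <Z> = cos(theta), so the optimum is theta = pi, energy -1.
# A real variational algorithm would estimate <Z> from repeated runs of a
# shallow circuit on hardware; the closed form stands in for that here.

def expectation_z(theta: float) -> float:
    """<psi(theta)| Z |psi(theta)> for |psi(theta)> = Ry(theta)|0>."""
    return math.cos(theta)

def minimize(step: float = 0.1, iters: int = 200) -> tuple[float, float]:
    """Plain gradient descent on the (classically known) energy landscape."""
    theta = 0.5  # arbitrary starting guess
    for _ in range(iters):
        grad = -math.sin(theta)  # d/dtheta of cos(theta)
        theta -= step * grad
    return theta, expectation_z(theta)

theta, energy = minimize()
print(f"theta ~ {theta:.3f}, energy ~ {energy:.3f}")  # converges toward pi, -1
```

The key point is the division of labor: the quantum circuit stays shallow (one rotation here), while the classical optimizer does the heavy lifting of searching parameter space.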
Error mitigation. New methods are being developed to handle the effects of noisy hardware in a way which doesn’t require fault-tolerant quantum computation. These methods let us run circuits with more gates or greater depth. Implication: even as hardware gets better, we don’t have to wait for fault-tolerance in order to successfully run complex circuits.
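One widely studied error-mitigation technique is zero-noise extrapolation: deliberately amplify the noise by known factors, measure the expectation value at each noise level, then extrapolate classically back to the zero-noise limit. A minimal sketch, using a hypothetical exponential-damping noise model purely for illustration:

```python
import math

# Toy zero-noise extrapolation (ZNE). The "hardware" below is a mock noise
# model (exponential damping of the ideal expectation value); on a real
# device the noise-scaled values would come from actual circuit runs.

def noisy_expectation(ideal: float, noise_scale: float, decay: float = 0.05) -> float:
    """Mock hardware: the ideal value damped by an assumed noise model."""
    return ideal * math.exp(-decay * noise_scale)

ideal = 0.8  # true (noise-free) expectation value; unknown to the mitigator
scales = [1.0, 2.0, 3.0]
measured = [noisy_expectation(ideal, s) for s in scales]

# Linear (Richardson-style) extrapolation through the first two points,
# evaluated at noise scale 0.
s1, s2 = scales[0], scales[1]
e1, e2 = measured[0], measured[1]
mitigated = e1 + (e1 - e2) * s1 / (s2 - s1)

print(f"raw: {measured[0]:.4f}, mitigated: {mitigated:.4f}, ideal: {ideal:.4f}")
```

The mitigated estimate lands much closer to the ideal value than the raw measurement does – at the cost of running the circuit several times at amplified noise, which is exactly the kind of overhead-for-accuracy trade these methods make.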
Circuit knitting. New approaches to executing quantum algorithms have explored running many smaller quantum circuits to emulate a larger-scale (or higher-depth) circuit. Implication: quantum computational problems can be broken down in a way which makes them amenable to being run on current and near-future quantum computers.
One definition, for clarification – the word “circuit” here means “a program or set of instructions which can be run on a quantum computer”, not “an electrical circuit”. I’ve been meaning to do a Quantum Quickies article on the topic, leveraging the first one on qubits. Thanks to co-author Will Zeng for emphasizing the importance of clarifying that point!
While there are certainly caveats associated with each of these themes (which I am happy to discuss with you in the comments), together they offer a pretty compelling case that near-future quantum computers should be able to be put to work on useful computational problems. What exactly those problems are is not yet entirely clear – more work with the commercial sector is needed.
Importantly – as we discuss in Section 4 – based on what is currently known about using quantum computers for cryptanalysis, the methods described above of variational algorithms, error mitigation, and circuit knitting do not appear to make it easier to tackle cryptographic problems. Phrased another way, as far as our survey could tell, the literature hasn’t (yet) found a viable path by which these methods could be used to hasten the advent of cryptographically-relevant quantum computation. We analyze the complexity of the circuits required using known quantum algorithms for cryptanalysis of the RSA and ECC cryptosystems, and conclude that they are sufficiently-complex that fault-tolerance will be necessary to run those circuits successfully.
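To give a feel for why fault-tolerance is needed, here is a back-of-envelope calculation (my own illustration, not the paper’s analysis) using the widely cited logical-cost formulas from Gidney and Ekerå’s 2019 factoring study: roughly 3n + 0.002·n·lg(n) logical qubits and 0.3·n³ + 0.0005·n³·lg(n) Toffoli gates to factor an n-bit RSA modulus. These are logical counts; physical qubit counts balloon further once error-correction overhead is included.

```python
import math

# Rough logical resource estimate for Shor's algorithm on n-bit RSA,
# using the cost formulas from Gidney & Ekera (2019), "How to factor
# 2048 bit RSA integers in 8 hours using 20 million noisy qubits".

def shor_logical_costs(n_bits: int) -> tuple[float, float]:
    lg = math.log2(n_bits)
    logical_qubits = 3 * n_bits + 0.002 * n_bits * lg
    toffoli_gates = 0.3 * n_bits**3 + 0.0005 * n_bits**3 * lg
    return logical_qubits, toffoli_gates

for n in (1024, 2048, 4096):
    q, t = shor_logical_costs(n)
    print(f"RSA-{n}: ~{q:,.0f} logical qubits, ~{t:.2e} Toffoli gates")
```

For RSA-2048 this works out to roughly six thousand logical qubits and billions of Toffoli gates – circuit sizes far beyond what any non-error-corrected device can execute, which is the crux of the argument in Section 4.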
To summarize the above points succinctly: economically-impactful quantum computations could be performed in the near-future given the advances discussed above, whereas those self-same advances do not seem to increase the likelihood of cryptographically-relevant quantum computations. Thus, it appears that the former will be successfully realized before the latter.
What’s more, the cybersecurity risk from quantum computers can be mitigated through the migration to quantum-safe cryptography (discussed in Section 5 of the paper). In that sense, while the risk is real, it is remediable in a fairly well-understood way. If your organization hasn’t gotten started with this, you really should!
In sum, this paper draws a fairly reasonable and well-informed conclusion from the research literature. We make no promises of attaining quantum advantage over any specific timeframe (though I have my personal thoughts on that), and we definitely acknowledge the landscape of quantum algorithms is a dynamic one. This said, given what is currently known, the claim quoted from the abstract above seems extremely defensible.
I would encourage you to take a look at the “Paper Motivation and Summary” section for a deeper overview of the paper. Reading the “Conclusion and Discussion” section would be good as well, as it sums up each section and provides some recommendations for various groups.