Wells Fargo was introduced to quantum computing’s potential in 2019. At that time, the financial services company was exploring post-quantum cryptography and already had research relationships with academic institutions working on AI. When IBM started work on its quantum computing research network, Wells Fargo decided to explore the possibilities.
“As we started unpacking approaches to post-quantum cryptography and what quantum generally has to offer, it became obvious that there are potential use cases where you could apply quantum to do certain transactions in an exponentially more efficient way,” says Chintan Mehta, CIO for digital technology and innovation at Wells Fargo. The team also found computer science problems that classical computers cannot solve in a reasonable amount of time but that could potentially be solved with quantum techniques. “We saw a line of sight for solving mathematical problems, which would be a big amplification of productivity,” Mehta recalls.
Today, Wells Fargo is working with the MIT–IBM research group to explore and test-drive mathematical computations on quantum systems. Among the experiments are new approaches to vector mathematics and generalized linear algebra. One example use case: rapidly recalculating prices across a large book of trades, in parallel and more efficiently than classical systems allow. Other use cases in the financial industry include leaning on quantum’s data-modeling capabilities to handle the complex data structures on which fraud detection systems are built. Creaky fraud detection mechanisms are one reason customers can wait weeks to be onboarded; quantum is expected to shorten those processing times dramatically.
Mehta emphasizes that Wells Fargo is focusing on the utilitarian aspects of quantum computing. “We are participating in quantum research to help validate use cases in the financial services industry that will benefit from quantum computing; we are not doing pure fundamental research, like how to build a physical quantum infrastructure for example,” he explains.
Quantum’s promise
Traditional computing is based on binary arithmetic, which transistors can easily handle. Conventional computers have made strides by cramming more transistors into ever-smaller integrated circuits.
Quantum computing upends the rules of the game with the qubit, the quantum equivalent of a bit, which can occupy a superposition of states rather than just a binary 0 or 1. The state space of a quantum system grows exponentially with each added qubit: a theoretical 200-qubit system has 2²⁰⁰ basis states, 2¹⁰⁰ times as many as a 100-qubit machine.
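To make that exponential growth concrete, here is a back-of-the-envelope Python sketch (illustrative only; it assumes 16 bytes per complex amplitude) of how much memory a classical machine would need just to store an n-qubit state:

```python
# Illustrative only: an n-qubit state is described by 2**n complex
# amplitudes. Storing them classically (16 bytes each as complex128)
# becomes impossible long before 100 qubits.
def statevector_bytes(n_qubits: int) -> int:
    return 16 * 2**n_qubits

for n in (10, 30, 50, 100):
    print(f"{n} qubits -> {statevector_bytes(n):.2e} bytes")
```

At 100 qubits, the state vector alone would exceed any conceivable classical memory, which is part of why simulation eventually has to give way to real hardware.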
That scaling lets quantum computers take on problems that traditional computers simply don’t have the power to solve, such as optimization over many interacting variables, probabilistic modeling, and what-if scenarios. Consider driver-assistance apps that steer every driver around a congested road and thereby turn the alternate route into the new traffic jam; quantum-enabled traffic optimization could avoid that by searching the joint space of route assignments rather than rerouting one driver at a time.
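A toy classical sketch (all numbers invented for illustration; this is brute force, not a quantum algorithm) shows both the failure mode and why the joint problem explodes combinatorially:

```python
import itertools

# Toy congestion model: travel time on each of two roads grows with the
# number of cars using it. Greedy rerouting sends every car to the same
# "faster" alternate at once; joint optimization splits the traffic.
CARS = 10

def total_time(assignment):
    # assignment[i] is 0 or 1: which of the two roads car i takes
    on_road_1 = sum(assignment)
    on_road_0 = CARS - on_road_1
    # each car's travel time equals the load on its road
    return on_road_1**2 + on_road_0**2

greedy = (1,) * CARS  # every car diverts to the same alternate route
best = min(itertools.product((0, 1), repeat=CARS), key=total_time)

print("greedy total travel time:", total_time(greedy))   # 100
print("optimal total travel time:", total_time(best))    # 50
```

Even this toy has 2¹⁰ joint assignments to search; real road networks push that count far beyond what classical brute force can handle, which is where quantum optimization is hoped to help.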
Large neural networks, which sit under the hood of many advanced computations across industries, depend on linear algebra to train billions of parameters. “Quantum exponentially accelerates that,” Mehta says. “Overall network build times go from days to minutes.”
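To see why linear algebra is the bottleneck, note that a dense layer’s forward pass is a single matrix multiplication, and training repeats such multiplies, plus their gradients, an enormous number of times. A minimal classical NumPy sketch (illustrative only; the dimensions are made up, and this shows the classical workload, not the quantum speedup):

```python
import numpy as np

# Illustrative only: one dense layer's forward pass is a matrix multiply.
# Training a large network repeats operations like this (and their
# gradients) billions of times, so faster linear algebra means faster builds.
rng = np.random.default_rng(seed=0)
batch, d_in, d_out = 64, 1024, 1024

x = rng.standard_normal((batch, d_in))   # a batch of inputs
W = rng.standard_normal((d_in, d_out))   # the layer's weights
activations = np.tanh(x @ W)             # matmul plus elementwise nonlinearity
print(activations.shape)                 # (64, 1024)
```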
Known unknowns
Despite quantum technology’s promise, one of the challenges of working with it is the many unknowns, Mehta says. “There’s still a huge gap between what we think quantum can do and what quantum can do today, especially when it comes to higher-order mathematical operations,” he says. “The stability of those operations, the expectation that humankind has around the repeatability of a computational construct, is fundamentally missing from quantum today. You can run a computation multiple times and potentially get different answers each time.” That’s a source of some worry. “We’re in that space where the simulation has gone much further ahead than the actual physical quantum computer,” Mehta says.
The other unknown is whether the technology will pan out as predicted. Mehta contrasts quantum and artificial intelligence: “With AI we knew the technology works; the unknown was whether a user could be sure a specific model works in their enterprise landscape,” he says. “With quantum, the probability of failing is higher because you haven’t proven anything and there’s no common baseline against which to measure success.”
What would success look like for Wells Fargo? “It is going to be a case of milestones being hit as opposed to very specific outcomes,” Mehta says. “Can I do multiple simulations? Fourier transformations? Differential calculus? Milestones will measure when we move from very basic discrete math to more sophisticated [computations].”
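One of those milestones, the Fourier transform, already has a textbook quantum circuit. Here is a minimal sketch using IBM’s open-source Qiskit library (this assumes the qiskit package is installed; QFT is part of its circuit library):

```python
# Minimal sketch: build a 3-qubit quantum Fourier transform from
# Qiskit's circuit library and print its gate-level decomposition.
from qiskit.circuit.library import QFT

qft = QFT(num_qubits=3)
print(qft.decompose().draw())  # Hadamards plus controlled phase rotations
```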
The ultimate goal for Wells Fargo is “a strong library of mathematical capabilities that you can build a use case off of,” Mehta says.
Laying the groundwork
Because success in quantum computing means systematically checking off milestones, Mehta advises companies to run a low-intensity but long-term program. “You cannot put all your discretionary spend on this, but do research consistently,” he advises. The payoff could be huge, but it may be a long time coming.
451 Research analyst James Sanders also advises patience and playing the long game. Sanders, who is part of 451’s S&P Global Market Intelligence practice, says companies need to start building software now.
“Now is the time to start investigating what potential business problems your enterprise cannot solve today because of a lack of computing power,” Sanders says. “It’s time to dust those off and ask whether they can be handled through quantum computing.”
That landscape assessment process is the first in a five-stage set of recommendations from technology consulting firm Capgemini on how organizations can prepare for the quantum advantage. “Assessment is a critical part of it; you need to engage data scientists and business experts to identify the problems in your industry and in your organization that cannot be solved with traditional computers and that have the potential of being solved with quantum computing,” says Satya Sachdeva, vice president of insights and data at Capgemini.
Among the firm’s other recommendations: build a small team of experts; translate the most potent use cases to small-scale quantum experiments; strike long-term partnerships with technology providers to overcome technical obstacles; and develop a long-term strategy to scale up the skills base.
Enterprises looking to get started have plenty of options, says Lian Jye Su, principal analyst at global technology intelligence firm ABI Research. “Public cloud players such as Alibaba, Amazon, Google, and IBM offer researchers services to remotely run quantum programs and experiments,” Su says. Developers can now build quantum applications using IBM’s Qiskit, Google’s Cirq, Amazon Braket, and others, which are “open-source libraries designed for optimization of quantum circuits for quantum-classical or quantum-only applications, including machine learning,” Su says. “All these services are available online.”
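As a flavor of how low the barrier to entry has become, here is a minimal Qiskit sketch (again assuming the qiskit package is installed) that prepares an entangled two-qubit Bell state and inspects its outcome probabilities locally, with no cloud hardware required:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Minimal sketch: prepare a Bell state and compute its measurement
# probabilities with Qiskit's built-in statevector utilities.
qc = QuantumCircuit(2)
qc.h(0)       # put qubit 0 into an equal superposition of 0 and 1
qc.cx(0, 1)   # entangle qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())  # {'00': 0.5, '11': 0.5}
```

Essentially the same circuit could then be submitted to cloud-hosted quantum hardware from the providers Su mentions.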
Early mover advantage
While experts expect quantum and classical computing to coexist long into the future, with certain processes offloaded to quantum and then looped back into conventional processing, the time for enterprises to dive in is now, Sachdeva says. “It’ll be an early-mover advantage, and if you delay, that advantage will be lost,” he says.
As for Wells Fargo, it is keeping its quantum program on a slow burn. Mehta says CIOs have to keep all advanced research projects going at the same time. “You still need to continue to improve your other non-quantum capabilities, whether it’s cloud-based specialized infrastructure or other technologies, in parallel,” Mehta says. “At some point when quantum materializes and the [computing capabilities] converge, it’s going to be extremely disruptive in a positive way.”
Now is the time to prepare for that disruption, as Wells Fargo is doing.
“We believe that quantum is at a stage where classical computing was 30 to 40 years ago,” Mehta says. “That said, it will evolve much faster than anything we have seen before. Not engaging is not an option. This will be the defining technology for the future.”