In brief

  • After 40 years of research, the industrialization of quantum computing has begun and is set to transform Data Analytics & AI, with benefits expected within 2–3 years.
  • Computations on data in quantum systems promise new analytics opportunities and exponential speed gains for the most challenging business problems.
  • Despite the challenges of scaling quantum systems and integrating them into business data workflows, the industry is progressing fast toward enterprise readiness.
  • Businesses engaging in quantum incubation & readiness programs today can strategically position themselves for future disruptions.


A new era of information processing

We live in a world of accelerating growth of the global digital economy, where the digitalization of information creates big data assets and digital technologies become more powerful, more accessible, and cheaper. This is reflected in entirely new business operating models and in the ways we interact, work, shop, and receive services. This acceleration and technological evolution are the result of complementary advances, from the internet, cloud and edge computing, and mobile electronics to open-source collaboration. The backbone and key enabler of this progression is the semiconductor industry.

Building on several physics and engineering breakthroughs and the invention of the transistor in the first half of the 20th century, the industry rapidly evolved from special- to general-purpose computing technologies and consistently followed the performance predictions of Moore’s Law, doubling the number of transistors in integrated circuits about every two years. This involved shrinking transistors to today’s scale of about 2 nanometers, which corresponds to only a few atoms placed side by side. At this level, however, transistors run into quantum physical limits and noise challenges, and the fabrication process becomes uneconomical.

As a result, the industry reacted with new 3D hybrid stacking designs, the advancement of application-specific integrated circuits for neuromorphic computing targeting Artificial Intelligence (AI) applications, and the opening of an entirely new era of information processing: Quantum Computing (QC).

Tremendous progress has been made since Richard Feynman sparked the idea of QC over 40 years ago, but particularly various breakthroughs over the last 4 years have made this new computing era a near-term reality for relevant business applications.

From bit to qubit

The way we represent and process information, from the abacus used for arithmetic in ancient times to the transistors in the latest iPhone, has not changed much on a fundamental level: computations are deterministic in nature, and their logic follows our macroscopic understanding of the world. The paradigm of information representation and processing in QC, by contrast, follows the quantum mechanical laws at atomic scales, which are fundamentally different from the world we are used to.

While the smallest unit of information in a classical computer is a bit, a binary digit deterministically represented as either “0” or “1”, the nearest equivalent unit in a quantum computer is the qubit, a two-state quantum system probabilistically represented as a coherent superposition of both “0” and “1”.
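For readers who prefer to see this in code, the minimal Python sketch below (illustrative only, using plain NumPy rather than any vendor’s quantum SDK) represents a qubit as a two-component state vector, applies a Hadamard gate to create an equal superposition of “0” and “1”, and samples measurement outcomes:

```python
import numpy as np

# A qubit state is a 2-component complex vector |psi> = a|0> + b|1>
# with |a|^2 + |b|^2 = 1. (Illustrative sketch, not tied to any quantum SDK.)
ket0 = np.array([1.0, 0.0], dtype=complex)            # the classical "0" state
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # gate that creates superposition

psi = hadamard @ ket0                                  # equal superposition of 0 and 1
probs = np.abs(psi) ** 2                               # Born rule: measurement probabilities

# Measuring collapses the superposition: each shot yields a single classical bit.
rng = np.random.default_rng(seed=7)
shots = rng.choice([0, 1], size=1000, p=probs)
print("amplitudes:", psi)                 # ~[0.707, 0.707]
print("P(0), P(1):", probs)               # ~[0.5, 0.5]
print("measured 1s out of 1000 shots:", shots.sum())
```

Each measurement collapses the superposition to a single classical bit, which is why roughly half of the shots return 1.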

There are many physical representations of qubits that can be manipulated for controlled computation, such as state encodings in an atom, the spin of an electron, the polarization of a photon, or more complex systems. The rules of the game for processing quantum information are anything but intuitive. For example, reading a 100-page digital book written in bits one page at a time reveals 1% more of the book’s information content with each page. If the book were written in entangled qubits, the information would instead be encoded in the correlations among the pages, requiring a collective observation of all 100 pages at once to retrieve it. The flip side is that describing the quantum state of such entangled qubits classically requires exponentially many bits.
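The book analogy can be sketched in miniature, assuming a two-qubit Bell pair as a stand-in for the entangled pages: each qubit alone looks random, the pair is perfectly correlated, and the classical description grows as 2^n amplitudes.

```python
import numpy as np

# Two entangled qubits (a Bell pair): describing n qubits classically takes a
# 2**n-dimensional state vector, here 2**2 = 4 complex amplitudes.
bell = np.zeros(4, dtype=complex)
bell[0b00] = 1 / np.sqrt(2)   # amplitude of |00>
bell[0b11] = 1 / np.sqrt(2)   # amplitude of |11>

# Individually, each qubit looks like a fair coin; jointly, they are perfectly correlated.
probs = np.abs(bell) ** 2
rng = np.random.default_rng(seed=42)
for outcome in rng.choice(4, size=10, p=probs):
    q1, q0 = (outcome >> 1) & 1, outcome & 1
    print(f"qubit_1={q1} qubit_0={q0}")   # always 0,0 or 1,1 -- never mixed

# The classical description grows exponentially with the number of qubits.
for n in (10, 30, 50):
    print(f"{n} qubits -> {2**n:,} complex amplitudes")
```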

In 2019, Google’s 53-qubit device named Sycamore performed in just three minutes a calculation that would have taken the most powerful supercomputer on earth at least a few days!

The noise around quantum computing

This exponential scaling behavior gives rise to great economic interest in using QC to solve complex, high-dimensional optimization problems much faster, or to tackle problems that are simply out of reach for today’s semiconductor technology.

One of those breakthroughs was the demonstration of quantum supremacy with Google’s 53-qubit device named Sycamore in 2019, which performed a calculation in three minutes that would have taken the most powerful supercomputer on earth at least a few days. While this calculation admittedly had no practical use beyond demonstrating that classical systems cannot simulate quantum systems efficiently, it marked the opening of the Noisy Intermediate-Scale Quantum (NISQ) era. The name highlights one of today’s main engineering challenges: protecting qubits from decoherence and from noise induced by the surrounding macroscopic environment that controls them. This leads to computing error rates currently on the order of 10⁻³ per operation and demands intelligent error correction techniques to scale QC to hundreds of thousands of qubits and thus realize its full potential for massive, high-impact applications in the future Fault-Tolerant Quantum Computing (FTQC) era.
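To make the scaling challenge concrete, the back-of-envelope Python sketch below (an illustrative assumption, not a statement about any specific device) shows how quickly per-gate errors compound: if each gate fails with probability p, a circuit of g gates succeeds with probability roughly (1 - p)^g.

```python
# Back-of-envelope sketch: assume an independent per-gate error rate p, so a
# circuit of g gates runs error-free with probability roughly (1 - p)**g.
def success_probability(p: float, gates: int) -> float:
    return (1.0 - p) ** gates

for p in (1e-2, 1e-3):
    for gates in (100, 10_000, 1_000_000):
        print(f"error rate {p:.0e}, {gates:>9,} gates -> "
              f"success ~ {success_probability(p, gates):.3g}")

# Deep, high-impact algorithms need millions of gates, which is why quantum
# error correction is essential to reach the fault-tolerant (FTQC) era.
```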

"90% of organizations will partner with consulting companies or full-stack providers to accelerate quantum computing innovation through 2023."

– GARTNER

But even in today’s NISQ era, the QC potential is expected to begin unfolding and supercharging applications in advanced data analytics: from solutions that efficiently learn complex information patterns in vast amounts of data to the supply of a new breed of talent to cope with tomorrow’s high demand for quantum information scientists, innovators, and leaders.

Making the quantum leap

Regardless of how long it will take to enter the FTQC era, it is of paramount importance for every company to prepare for a quantum-accelerated world: to upskill and foster ideation about what QC could do in their industry, to position themselves strategically with quantum use cases, and to maximize the insights and business value they can gain from hybrid quantum data analytics solutions, starting today.

Dr. Manuel Proissl

Senior AI Leader – Quantum Computing Strategy and R&D


Eric Dombrowski

Data Analytics and AI SPoC for Resources Industries


Kinan Halabi

Quantum Strategy Lead ASGR


Tim Leonhardt

Technology Innovation Quantum Lead - ASG

