Many experts believe that the biggest advances in computing capability are still ahead of us. For federal agencies — charged with tackling the world’s gnarliest and most pervasive problems — these fast-emerging capabilities in computational power present both an incredible opportunity as well as a significant potential threat. This trend, Computing the Impossible, explores both the positive and cautionary dimensions of the growing computing power coming into our grasp.
Over the last five decades, the state of computing has progressed at a truly astonishing rate.
Consider that in 1971, chip-maker Intel launched the Intel® 4004 processor, the first general-purpose programmable processor on the market. At that time, the 4004 was a technical marvel: the size of a small fingernail, it held 2,300 transistors — tiny electrical switches representing the 1s and 0s that are the basic binary language of computers. Each transistor on the 4004 was 10 micrometers wide — about one-tenth the thickness of a human hair.
Impressive as that was, today we see state-of-the-art microprocessors packing 60 billion — even 80 billion — transistors, and chipmakers are now approaching 2-nanometer (nm) process technology, with features narrower than the width of a single strand of human DNA.
Federal Technology Vision 2022: Computing the Impossible
Chris Copeland shares a summary of Federal Technology Vision 2022’s Trend 4: Computing the Impossible.
This trend toward ever-smaller transistors and ever-more-powerful computers has been remarkable to observe. Incredibly, it was also predicted way back in 1965. That’s when Gordon Moore, then the research director at Fairchild Semiconductor Corporation, observed in a magazine article that, due to the ever-shrinking size of transistors, twice as many transistors would fit onto a computer chip roughly every year. This observation became known as Moore’s Law. (In 1975, Moore adjusted his estimate to a doubling of transistors every two years.)
Indeed, Moore’s observation has charted pretty accurately the trajectory of technology since then, becoming a cornerstone on which much of our recent innovation and economic growth has been built. But today, many industry leaders, including Moore himself, agree that Moore’s Law will soon — if it hasn’t already — bump into the physical and engineering limits of what is possible. The basic components driving technology today are approaching a fundamental limit of smallness: the atom (which ranges from about 0.1 to 0.5 nanometers). Getting smaller yet requires new, more innovative designs and different materials.
However, even as the blistering pace of advancement for classical or binary computing begins to cool, entirely new varieties of computers are now emerging that apply completely different approaches to the challenge of fueling tomorrow’s technological advances.
Agencies are optimistic about next-gen computing
Percentage of U.S. federal executives who say that a category of next-generation computing will have a breakthrough or transformational positive impact on their organizations in the future.
The advancement getting the most attention, because it promises to be so disruptive and transformative, is quantum computing. This fast-emerging technology harnesses the laws of quantum mechanics — the unique way that exceptionally small, subatomic particles can exist in multiple states at the same time — to solve problems too complex for classical computing.
With classical computing, the smallest unit of data is a bit, which holds a single value of either 0 or 1. Quantum computers, by contrast, use basic units known as qubits (pronounced “cue-bits”). Their computing power derives from the ability of each qubit to be both 1 and 0 simultaneously, rather than being restricted to one or the other. This property, known as superposition, enables quantum computers to run more complicated algorithms, tackle millions of computations simultaneously, and operate far faster than traditional computers. As the number of qubits in a quantum computer grows, the computer becomes exponentially more powerful: two qubits can represent four values at once, three qubits eight values, four qubits 16, and so on.
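The exponential scaling in that last sentence is easy to illustrate: each added qubit doubles the number of values the machine can represent at once. A quick back-of-the-envelope sketch (our illustration, not from the report):

```python
# Illustrative only: the number of simultaneous values an n-qubit register
# can represent grows as 2**n, which is also why classically simulating
# even a few dozen qubits overwhelms conventional machines.
def state_vector_size(num_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** num_qubits

for n in (2, 3, 4, 10, 50):
    print(f"{n} qubits -> {state_vector_size(n):,} simultaneous values")
```

At 50 qubits the count already exceeds a quadrillion, hinting at why quantum hardware is expected to outrun classical simulation.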
Indeed, even though widespread commercial use of quantum computers is still thought to be at least several years away, one very specific and practical application of quantum computers is already weighing heavily on the minds of cybersecurity experts around the world: the decoding of current encryption standards. Today’s cryptographic algorithms are generally impervious to classical computers, but not to quantum computers. As far back as 1994, mathematician Peter Shor demonstrated that a quantum computer would be able to quickly neutralize RSA encryption, one of the primary security standards in use today. Cyber experts even have a special name for that fateful day when quantum computers will be able to render current encryption standards useless: Q-day.
Q-day: The day when everything that runs on computer systems — our financial accounts, government secrets, power grids, transportation systems, and more — may suddenly become susceptible to quantum-powered cyberattacks.
This is a major national security concern for the U.S. government, since near-peer competitors such as China and Russia are investing billions of dollars to advance their own quantum capabilities in what has become a high-stakes arms race. Already, cyber experts note that criminals and nation-states have adopted an ‘intercept now, decrypt later’ strategy, in which they seize and store sensitive encrypted traffic today with the intention of unscrambling it later, once they develop sufficient quantum computing capability.
The U.S. government is taking notice. In 2015, the National Security Agency announced its intention to transition to quantum-resistant protocols. And in 2022, President Joe Biden signed a National Security Memorandum (NSM) — titled Promoting United States Leadership in Quantum Computing While Mitigating Risks to Vulnerable Cryptographic Systems, also known as NSM-10 — that directs federal agencies to migrate vulnerable cryptographic systems to quantum-resistant cryptography as part of a multi-year effort.
While a necessary step, there is far more that needs to be done to move the country’s digital operations to new quantum-safe security standards. Any proposed replacement will need to withstand months or even years of public scrutiny and challenges before it is entrusted to protect intellectual property, financial data, and state secrets.
Quantum today is considered the pinnacle of next-generation problem solving, but there are others. High-performance computing (HPC) systems, or massively parallel processing supercomputers, are the most mature of the new categories of compute. HPCs help organizations leverage volumes of data that would be too expensive, time-consuming, or impractical for traditional computers to handle. HPCs typically rely on different hardware and system designs, in which multiple processors, each tackling a different part of a problem, are connected to operate simultaneously on complex problems involving large amounts of data. Common HPC system designs include parallel computing, cluster computing, and grid and distributed computing.
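The divide-and-conquer idea behind those designs can be sketched in a few lines. This toy example (our illustration; real HPC clusters coordinate many nodes with frameworks such as MPI) splits a large sum across concurrent workers and then combines the partial results:

```python
# Illustrative sketch of the HPC pattern: partition a problem into chunks,
# process the chunks concurrently, then combine the partial answers.
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(data, workers: int = 4):
    data = list(data)
    chunk = max(1, len(data) // workers)
    # Split the input into roughly equal pieces, one per worker.
    pieces = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(sum, pieces)   # each worker sums one piece
    return sum(partials)                   # combine the partial results

print(parallel_sum(range(1_000_000)))
```

The same shape — scatter, compute, gather — underlies cluster, grid, and distributed computing; only the hardware and the communication layer change.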
Scores of HPC service offerings by public and private technology and cloud service providers have already sprung up in response to exploding commercial demand and an ever-expanding menu of use cases. In financial services, for example, HPCs are helping detect fraud, analyze risk, and conduct simulations. In retail, they are used to conduct consumer profiling, inventory analysis, logistics, and revenue predictions. In life sciences, HPCs are enlisted for genome processing, molecular modeling, and pharmaceutical design. In the energy sector, they process seismic data needed to search for oil and gas deposits and conduct weather simulations needed to calculate optimal wind turbine parameters. And HPCs take on a wide array of other projects to support automotive, film, media, gaming, and aerospace companies, among others.
Another emerging category of next-generation computing relies directly on natural biological processes to store data, solve problems, or model complex systems in fundamentally different ways. At the forefront of biocomputing is data storage. One estimate predicts DNA could store an exabyte of data in just one cubic centimeter of space, with the potential to persist for over 700,000 years, based on the age of the oldest biological DNA recovered on Earth. That reliability, along with the economical use of space and energy, could be transformative at a time when our penchant for creating data is rapidly outpacing our ability to store it effectively. Companies are generating more data than ever, and in highly regulated industries like financial services they are expected to retain that data for long periods. Indeed, DNA as a solution to this problem is more than science fiction: in 2019, Microsoft became the first company to demonstrate the ability to successfully store and retrieve data in fabricated DNA.
But biological-based computing goes well beyond data-storage use cases. For example, in 2017, a team of researchers programmed human cells to obey 109 different sets of logical instructions, proving that cells can understand and execute directions correctly and consistently. With more development, this could lead to the programming of cells to fight diseases, like cancer, in more sophisticated and controlled ways. In this case, the researchers programmed cells that lacked a specific enzyme to produce a blue fluorescent protein that made them light up. Using similar approaches, cells could be programmed to light up when mixed with a patient’s blood sample, indicating that the patient has a particular disease — a much cheaper alternative to current methods requiring expensive machinery and analysis of blood samples.
Somewhat related to biocomputing are biology-inspired computing systems. Also known as biomimicry, these computers draw inspiration from biological processes and have been applied in areas ranging from chip architectures to learning algorithms; successful pilots have shown this emergent field can deliver benefits like greater power efficiency, speed, and accuracy on more complex problems. One technology at the forefront of biomimicry is neuromorphic computing. Neuromorphic chips, like Intel’s Loihi, introduce a brand-new design to computer chips: they are modeled after the human brain. The chips use artificial neurons to transmit information in a way that is more power-efficient than traditional CPUs. This architecture is also optimized for executing Spiking Neural Networks (SNNs), a different approach from the Artificial Neural Networks (ANNs) that power today’s AI systems. An SNN leverages simulated neurons to transmit input and output data, while an artificial synaptic layer strengthens (or weakens) the connections between neurons, allowing the system to learn in a way very similar to how the human brain operates.
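To make the spiking idea concrete, here is a minimal leaky integrate-and-fire neuron, the textbook building block of spiking networks (a simplified sketch of ours, not Loihi’s actual neuron model): the membrane potential integrates incoming current, leaks over time, and the neuron emits a spike only when a threshold is crossed.

```python
# Toy leaky integrate-and-fire (LIF) neuron. Unlike an ANN unit, which
# outputs a continuous value every step, a spiking neuron stays silent
# until accumulated input crosses a threshold — one reason SNN hardware
# can be so power-efficient.
def simulate_lif(inputs, leak=0.9, threshold=1.0):
    potential, spikes = 0.0, []
    for current in inputs:
        potential = potential * leak + current  # integrate with leak
        if potential >= threshold:              # fire...
            spikes.append(1)
            potential = 0.0                     # ...and reset
        else:
            spikes.append(0)
    return spikes

print(simulate_lif([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # → [0, 0, 1, 0, 0, 1]
```

Note how the neuron fires only twice across six inputs: no input crossing the threshold means no activity, and no activity means almost no energy spent.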
Stepping back to what these machines actually let us do, consider robotics. Currently, to design autonomous or semi-autonomous robotics, engineers must decide where to put the intelligence. The machines need to execute a set of instructions, but also adapt, react, and learn about their environment. One option is to place AI models out in the field (or “the edge,” as many in the Defense Department refer to it), far from a supporting data center. This is, for example, how self-driving cars learn to drive: by driving out on the road, allowing their machine learning models to ingest more data and develop better responses to what they “see.” But in such cases, the algorithms require extremely power-hungry GPUs. With the current limitations of batteries, power consumption becomes a significant design challenge, constraining not just battery life but what the system can do. A 100-word library for natural language processing, for instance, is far less computationally intense and power-hungry than a 2,000-word library, which means power considerations will directly affect uses like human-to-machine interaction.
Another option is for AI processes to run in the cloud, but then engineers run into a different set of limitations around bandwidth and latency. No one wants a drone or a car that makes a decision half a second too late. This is where neuromorphic computing provides a clear advantage – it can run AI systems that allow for learning, more natural interaction, and more, in a power-efficient way. It opens the door to a world of robotics and edge computing that we can see from afar but have yet to attain.
And robotics and edge computing are just the beginning. As the field grows, it’s becoming clear that the human brain is particularly good at solving certain problem sets (relatively) quickly. For instance, modeling multidimensional chemical processes or solving constraint satisfaction problems are areas where brain-inspired algorithms can provide a distinct computational advantage. These advantages could be leveraged for use in waste or carbon recapture or for hyper-personalization, which many view as potential billion-dollar businesses on the horizon.
These new, non-classical varieties of computing capability will dramatically reduce the difficulty of solving some of the world’s biggest challenges. Moreover, while these advancements may lead us to faster, more efficient computing power, they shouldn’t be viewed as eventual replacements or substitutes for the classical computers we rely on today. Their use cases and capabilities will vary depending on how they are designed and architected.
Given the breakthrough technologies above, change, perhaps dramatically so, is expected. So, what will this change mean for federal agencies and the missions and business operations they support?
of U.S. federal executives agree that their organization is pivoting in response to the unprecedented computational power that is becoming available
The global race to quantum
Let’s start with quantum, which is coming at us with incredible velocity due to the intense degree of priority, investment, and geopolitical consequence attached to it. When it comes to computing the impossible, the hottest space to be in is quantum, where there is a high-stakes R&D contest under way among tech giants, startups, and governments alike. Already, it appears that quantum pioneers have eclipsed a prized milestone: “quantum supremacy,” which is when a quantum computer performs a calculation that no classical computer can perform in a reasonable amount of time.
In October 2019, Google claimed it was the first to achieve quantum supremacy when its 53-qubit quantum computer, called Sycamore, performed a task known as random circuit sampling, which involved repeating a sampling process a million times. The computer completed the task in 200 seconds — by contrast, Google claimed it would take a state-of-the-art supercomputer 10,000 years to do the same task. But Google’s claim of quantum supremacy was immediately disputed by another top rival in the quantum race, IBM, which published a paper arguing that the same calculation could be performed in 2.5 days on a classical supercomputer using an improved technique.
In 2020, China’s leading quantum research group made its own declaration of quantum supremacy. A team at the University of Science and Technology of China, in Hefei, used its system to accomplish in 200 seconds a mathematical task that it calculated would take the world’s then-most-powerful supercomputer, Japan’s Fugaku, more than 600 million years. The Chinese system, called Jiuzhang, takes an entirely different approach to quantum computing than Google’s Sycamore: whereas Sycamore relies on super-cold, superconducting metal for its quantum circuits, Jiuzhang’s design relies on manipulated light photons.
Ultimately, of course, the real objective is not quantum supremacy, but rather useful, practical quantum computers that can help address today’s many complex challenges such as cancer and climate change. Nevertheless, the well-publicized drama around the race to quantum supremacy illustrates the intensity and high stakes at play today — and it has set off an intense flurry of investment. In 2021, an estimated $3.2 billion was invested into quantum firms around the world, according to The Quantum Insider – an astounding increase from $900 million in 2020.
But the biggest contest in quantum is playing out at the geopolitical level, where the United States and China view quantum supremacy as a top national security imperative. Military planners see huge potential for quantum to dramatically transform computing; networks; communications and cryptography; positioning, navigation, and timing (PNT); sensors; and other foundations of modern warfare.
The Defense Science Board (DSB), which advises Pentagon leaders on scientific and technology matters, has determined that quantum sensing is the most mature military application of quantum technologies and is currently “poised for mission use,” according to a 2022 Congressional Research Service report that summarizes the DSB findings. “For example, it could provide alternative positioning, navigation, and timing options that could in theory allow militaries to continue to operate at full performance in GPS-degraded or GPS-denied environments,” the CRS report said. Importantly, quantum’s computational power may also turbo-charge advanced AI applications, such as those needed for autonomous weapons and machine-based targeting, the CRS report added.
Pentagon planners recognize they are up against a potential adversary — China — that has made enormous progress already in developing its own quantum capabilities. By one measure, China is considerably ahead of the U.S.: China holds more than 3,000 quantum-related technology patents, about twice as many as the U.S. Moreover, in 2021, China announced it had built a 4,600 km quantum communication network, which can effectively relay quantum data between satellites and locations on earth. All told, the Chinese government has invested as much as $25 billion into quantum technology from the mid-1980s through 2022, according to one estimate.
Likewise, the U.S. government is also marshalling considerable resources to develop a wide array of quantum capabilities. Congress passed the National Quantum Initiative Act in 2018, which authorized $1.2 billion in investments over five years to help animate the White House’s National Strategic Overview for Quantum Information Science of the same year. That strategy aims to leverage quantum capabilities for national security and economic growth by developing a quantum-smart workforce, deepening the government’s engagement with the quantum industry, providing needed critical infrastructure, and advancing international cooperation. The funds authorized by the law have been creating new federal research centers, institutes, and a new U.S. National Institute of Standards and Technology (NIST)-led Quantum Economic Development Consortium (QED-C) to harness myriad efforts across industry, academia, and government.
More recently, in NSM-10, President Biden declared it U.S. policy to “maintain United States leadership in QIS [quantum information science] through continued investment, partnerships, and a balanced approach to technology promotion and protection.” Today, we already see numerous federal agencies involved in quantum R&D efforts, including NASA, the Defense Department, the National Science Foundation, the Intelligence Advanced Research Projects Activity (IARPA), the Department of Energy National Laboratories, and NIST.
One key focus area for U.S. government agencies, Biden said, is mitigating the threat of future quantum computers being able to crack the public-key cryptography that keeps today’s digital systems relatively secure. Toward that effort, NIST led an international competition and selected four ‘Post-Quantum Cryptography’ (PQC) algorithms, designed to withstand attack even by quantum computers, that can serve as the basis for a future Internet security standard. NIST expects to finalize that standard by 2024.
Optimism in next-gen computing’s potential
Percentage of U.S. federal executives who say that a category of next-generation computing has the potential to address previously unsolvable problems.
Federal use cases for quantum
But as the government works with industry and academia to expand R&D and maintain U.S. leadership in this arena, are there quantum computing (QC) service offerings and use cases available either today or in the near-term future to advance federal missions and business operations?
While some debate exists about what qualifies as general-purpose QC and when it will become readily available, more targeted cloud-based offerings are commercially available today. By focusing on more limited use cases, providers can industrialize the operating environment to achieve the stability needed to deliver reliable problem solving. These include offerings from both QC pure-plays like IBM, Rigetti, and D-Wave as well as cloud leaders AWS, Microsoft Azure and Google Cloud Platform.
“Real quantum computers exist and can be used to solve meaningful problems,” notes IDC in the analyst firm’s Worldwide Quantum Computing Forecast, 2021–2025 report. “However, the underlying technology is still not ready for large-scale production and requires exceptionally stringent operating conditions to deliver stable outcomes and only the top IT vendors and service providers can afford to build and maintain them.” IDC further points out that within the next decade, QC technology “will be closer to large-scale consumption and be suited for solving problems so complex that no amount of classical compute, even in the shape of accelerated supercomputers, could solve them.”
Many of the QC use cases today across all industry sectors involve what is known as quantum annealing, which concerns the solving of discrete combinatorial optimization problems. Optimization problems search for the best of many possible combinations; this might involve, for example, finding greater efficiencies in scheduling or supply chains. Quantum annealers also address sampling problems, which involve building a probabilistic model of reality, typically for machine learning applications. Samples of data inform an algorithm about the model state for a given set of parameters, which can then be used to improve the model. Probabilistic models explicitly handle uncertainty by accounting for gaps in knowledge and errors in data sources. While quantum annealers are among the preferred types of QC technologies employed today, so too are quantum algorithms, cloud-based quantum computing, and quantum simulators.
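For intuition about what an annealer does, its classical cousin, simulated annealing, can be sketched in a few lines. This toy example (our illustration, not a real quantum workload) tackles a classic combinatorial optimization problem: partition a list of numbers into two groups whose sums are as close as possible.

```python
# Classical simulated annealing on a number-partitioning problem.
# A quantum annealer attacks the same class of problems by encoding the
# cost function in hardware; here we explore the search space in software.
import math
import random

def anneal_partition(nums, steps=20_000, seed=0):
    rng = random.Random(seed)
    assign = [rng.choice([1, -1]) for _ in nums]       # +1/-1 group labels
    cost = abs(sum(s * x for s, x in zip(assign, nums)))
    best = cost
    for step in range(steps):
        temp = max(1e-3, 1.0 - step / steps)           # cooling schedule
        i = rng.randrange(len(nums))
        assign[i] *= -1                                # flip one element
        new_cost = abs(sum(s * x for s, x in zip(assign, nums)))
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the "temperature" cools.
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
            best = min(best, cost)
        else:
            assign[i] *= -1                            # undo the flip
    return best                                        # smallest gap found

print(anneal_partition([3, 1, 4, 2, 2]))  # best possible difference is 0
```

The occasional acceptance of worse moves is what lets the search escape local minima — the role that quantum tunneling plays, loosely speaking, in a quantum annealer.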
Optimization challenges are particularly prevalent in the financial services and manufacturing industries, where much of today’s early QC activity is occurring. In the financial sector, companies are employing QC for credit and asset scoring, derivative pricing, portfolio management, fraud detection, and investment risk analysis, among other uses. Similar use cases would apply as well to federal agencies that are heavily financial — for example, the departments of Agriculture and Treasury, the Federal Reserve Board, the Federal Deposit Insurance Corp., and the Securities and Exchange Commission.
In manufacturing, current QC use cases include fabrication optimization and process planning, supply chain management, materials and chemistry discovery, structural design, fluid dynamics, aircraft design optimization, autonomous vehicle navigation, battery simulation, and robotics. Similar use cases, such as fleet optimization, would apply to Defense Department organizations such as the Armed Services’ materiel and systems commands, the Defense Logistics Agency, and the Missile Defense Agency, among others.
But many ripe use cases exist as well in the fields of healthcare and life sciences, energy, distribution and logistics, transportation, and IT services, most of which would have relevance for federal agencies.
HPCs assist many federal mission activities
Although many of the technology headlines today focus on quantum, it is high-performance computing (HPC) that is seeing a great deal of use across the federal government. Only a decade ago, HPC was prohibitively expensive for many organizations. The cloud has helped lower costs and dramatically broaden HPC’s appeal, just as the need for complex simulations, massive data analytics, and AI has gained considerable traction across many industry sectors.
Most large cloud vendors today include HPC-specific options among their offerings. There are also hybrid cloud offerings, managed services, industry-focused solutions, and specialized colocation from companies such as Hewlett Packard Enterprise (HPE), IBM, and Penguin Computing. And as cloud-based HPC offerings have proliferated, so too have federal use cases and anecdotes.
The Energy Department’s National Renewable Energy Laboratory is developing its Kestrel supercomputer to answer key questions needed to advance the adoption of cleaner energy sources. Core capabilities will include applied mathematics to support the most advanced problem-solving algorithms; computational science for complex modeling; energy-efficient operational features; and advanced computer science, visualization, and data management to empower programmers.
Three federal departments joined forces to create the COVID-19 Insights Partnership, which enabled the departments of Health and Human Services and Veterans Affairs to leverage the Energy Department’s Summit supercomputer, located at Oak Ridge National Laboratory, to accelerate COVID-19 research by running large scale, complex analyses on vast amounts of health data.
One of the more high-profile examples of how federal agencies teamed up with academia and the private sector to leverage HPC to address a national crisis is the COVID-19 High Performance Computing Consortium. In the early days of the COVID-19 pandemic, in March 2020, the Office of Science and Technology Policy (OSTP), the Energy Department, NSF, and IBM quickly teamed up to create a unique public-private partnership among government, industry, and academic leaders to provide COVID-19 researchers around the world no-cost access to advanced HPC and cloud computing systems and data resources, along with technical expertise and support. So far, the consortium has supported more than 115 projects covering a broad spectrum of technical areas, ranging from understanding the SARS-CoV-2 virus and its interaction with humans to optimizing medical supply chains and resource allocations.
The consortium demonstrated how the rapid availability of an advanced computing infrastructure can serve as a strategic national asset in times of crisis response, such as during hurricanes, earthquakes, pandemics, and wildfires. Consequently, in October 2021 the White House publicly proposed the creation of a National Strategic Computing Reserve, a new public-private partnership — modeled after the Civil Reserve Air Fleet and the United States Merchant Marine — that can quickly mobilize compute, software, data, and technical expertise in times of urgent need.
Many other potential use cases abound. For example, just as federal agencies have used HPCs to better understand the behaviors and qualities of the SARS-CoV-2 virus, so too could they use HPCs to better understand compounds that could create cleaner fuels or that are difficult to clean up in the environment.
Biocompute brings energy efficiency and speed to mission computing
Federal agencies are also taking active steps towards promoting bio-inspired computing. Not surprisingly, they are taking a leading role in defining key concepts and sponsoring advanced research through the work of a host of agencies, including NIST, NIH, DARPA, IARPA, NSF, and the Energy Department. Going a step further, the U.S. Navy is prototyping an autonomous robot inspired by how large fish, like tuna, navigate the open seas.
However, investments in neuromorphic computing are likely to have the most immediate impact. For example, the Army Research Laboratory, Department of Defense Supercomputing Resource Center (ARL DSRC) is exploring the use of neuromorphic computing to enable low-power AI systems more suitable for field deployment.
Researchers at Sandia Labs recently published a paper showing how neuromorphic computing can surpass AI in complex problem solving. Using the random walks statistical method for neuromorphic simulations, they were able to model a number of complex scenarios, such as disease transmission, X-ray scanning, social network interactions, and financial trading, with the potential to solve these problems faster using more energy-efficient methods. “These problems aren’t really well-suited for GPUs [graphics processing units], which is what future exascale systems are likely going to rely on,” notes Brad Aimone, the report’s author. He adds, “Basically, we have shown that neuromorphic hardware can yield computational advantages relevant to many applications, not just artificial intelligence to which it’s obviously kin.”
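For readers unfamiliar with the technique, a random-walk Monte Carlo calculation averages many independent random trajectories to estimate a quantity of interest. This classical toy example (our illustration, unrelated to Sandia’s code) estimates how far diffusion spreads in a fixed number of steps:

```python
# Toy random-walk Monte Carlo: average the squared end-to-end distance of
# many independent 1-D walks. Diffusion theory predicts the mean squared
# displacement approaches the number of steps taken.
import random

def mean_squared_displacement(steps, trials=10_000, seed=1):
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        # Each walker takes `steps` random +1/-1 moves from the origin.
        position = sum(rng.choice((-1, 1)) for _ in range(steps))
        total += position * position
    return total / trials

print(mean_squared_displacement(100))  # close to 100 for a 100-step walk
```

Sandia’s insight is that spiking neurons can generate and tally such walks natively — each spike is a random step — which is why neuromorphic hardware handles these workloads with far less energy than a GPU.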
Each one of these compute areas – quantum, HPC, and bioinspired compute – contributes to a specific niche, but taken as a whole, a clear trend emerges: We are in the midst of an evolution towards machines that, down to the very physics of their operation, are unlike any in existence today. As they grow, they will expand the window of what’s possible.
of U.S. federal executives agree that their organization’s long-term success will depend on the next-generation computing they leverage to solve the seemingly unsolvable problems not addressable by classical computing
of U.S. federal executives believe that next-generation computing has the potential to destroy their organization’s current business model
For decades, computers that could efficiently solve the world’s grand challenges have been nothing more than theoretical concepts. But enterprises can’t afford to think about them in the abstract any longer. They are rapidly improving, and their impact on our most fundamental problems and parameters may be the biggest opportunity in generations. The agencies that start anticipating a future with these machines will have the best shot at taking full advantage of that opportunity.
To learn more about how next-generation computing will affect federal agencies — including potential challenges that federal leaders should look out for and steps that agencies can take to prepare — please read the full PDF.
The annual Technology Vision takes a systematic look across the enterprise landscape to identify evolving technology trends with the highest possibility to disrupt businesses, governments, and societies over the next three years.