Today, Microsoft announced a significant quantum milestone by making our new Integrated Hybrid feature in Azure Quantum available to the public. This new capability enables seamless integration of quantum and classical computing in the cloud, a first for our industry and a major step forward on our journey toward quantum at scale. Researchers can now start developing hybrid quantum applications with a mix of classical and quantum code running together on one of today's quantum machines, Quantinuum's, in Azure Quantum.
Classical computer science has made great strides in the last century, is extraordinarily versatile, and has changed every industry. Although it keeps evolving, there are certain problems it can never solve. For computational problems that require accurate modeling of quantum physical phenomena, quantum computers will complement classical computers, creating a hybrid architecture that exploits the best features of each design.
The quantum industry has long understood that quantum computing will always involve a mixture of classical and quantum computation. In fact, it was a major talking point at the American Physical Society's (APS) annual March Meeting in Las Vegas. However, our industry is only beginning to grapple with, and design for, the future of hybrid classical and quantum computing at scale in the public cloud. At Microsoft, we are designing Azure to be the public cloud that makes scaled quantum computing a reality and then seamlessly passes the resulting profound benefits on to our customers. Essentially, AI, HPC, and quantum are being developed together as part of Azure, and this integration will have implications going forward in three important and perhaps surprising ways.
1. The power of the cloud will enable scaled quantum computing
Scientists need quantum computing at scale to solve some of society's most difficult and intractable problems, such as reversing climate change and tackling food insecurity. Based on what we know today, largely through our resource estimation work, a machine capable of solving such problems will need at least one million stable and controllable qubits. Microsoft is making daily advances toward a machine that can operate at this scale.
An important part of our scaling plan is to bring our quantum machine to the cloud alongside classical supercomputers. A driving force behind this design is the fact that the power of the cloud is required to run a fault-tolerant quantum machine. Achieving fault tolerance requires advanced error-correction techniques, which essentially means turning many physical qubits into logical qubits. Although our unique topological qubit design will greatly improve our machine's fault tolerance, advanced software and massive classical computing power are still required to keep the machine stable.
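To make the idea of turning physical qubits into logical qubits concrete, here is a minimal classical analogy: a three-bit repetition code with majority-vote decoding. Real quantum error correction relies on syndrome measurements and far more sophisticated codes, but the scaling intuition, that redundancy plus classical decoding suppresses the error rate, is the same. All names below are illustrative and not part of any Azure Quantum API:

```python
import random

def encode(logical_bit):
    """Encode one logical bit into three physical bits (repetition code)."""
    return [logical_bit] * 3

def apply_noise(bits, p, rng):
    """Flip each physical bit independently with probability p."""
    return [b ^ (rng.random() < p) for b in bits]

def decode(bits):
    """Recover the logical bit by majority vote."""
    return int(sum(bits) >= 2)

def logical_error_rate(p, trials=100_000, seed=1):
    """Estimate how often the decoded logical bit is wrong."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        noisy = apply_noise(encode(0), p, rng)
        errors += decode(noisy) != 0
    return errors / trials

p = 0.01
print(logical_error_rate(p))  # roughly 3 * p**2, far below the physical rate p
```

A logical error needs two of the three bits to flip at once, so the logical error rate falls to roughly 3p² instead of p; real quantum codes achieve the same kind of suppression at the cost of many physical qubits and continuous classical decoding.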
To achieve fault tolerance, our quantum machine will be deeply integrated with peta-scale classical computing power in Azure and will need to handle bandwidths between the quantum and classical machines exceeding 10-100 terabits per second. On each logical clock cycle of the quantum computer, we must perform this classical round trip to keep the quantum computer "alive" and to produce a reliable output. This throughput requirement may be surprising, but fault tolerance for large-scale quantum computing means that a machine must be able to perform a trillion operations while making at most one error.
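As a quick back-of-envelope check, the figures quoted above can be restated numerically. These numbers simply rework the text's own targets; nothing here is an official specification:

```python
# One error in a trillion operations implies the error rate per operation.
operations = 1e12
error_rate = 1 / operations
print(f"{error_rate:.0e}")  # 1e-12

# The quoted quantum-classical bandwidth, converted to bytes per second.
terabit = 1e12  # bits per second in one terabit/s
low_bw, high_bw = 10 * terabit, 100 * terabit
print(f"{low_bw / 8 / 1e12:.2f} to {high_bw / 8 / 1e12:.1f} terabytes/s")
```

An error rate of one in 10^12 per operation, sustained while moving on the order of a terabyte of decoding data every second, is what ties the quantum machine so tightly to classical cloud infrastructure.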
To put this number in perspective, imagine that each operation is a grain of sand. For the machine to be fault tolerant, only one grain out of all the grains of sand on Earth could be faulty. This kind of scaling is clearly only possible through the cloud, making Azure both a key enabler and a differentiator in Microsoft's strategy to bring quantum computing to the world at scale.
2. Classical computing capabilities in the cloud are already helping scientists solve quantum-mechanical problems
An incredible benefit of today's public cloud is that its scale lets scientists do more at lower cost. For example, researchers from Microsoft, ETH Zurich, and the Pacific Northwest National Laboratory recently presented a new automated workflow that leverages the scale of Azure for quantum chemistry and materials science. By streamlining classical simulation code and bringing it to the cloud, the team reduced the cost of simulating a catalytic chemical reaction by a factor of 10. These benefits will continue to grow as classical cloud computing capabilities develop further.
We increasingly see great potential for high-performance computing and artificial intelligence to accelerate progress in chemistry and materials science. In the short term, Azure will support research and development teams with scale and speed. In the longer term, when we bring our scaled quantum machine to Azure, it will enable greater accuracy in modeling new drugs, chemicals, and materials. The opportunity for progress and growth is enormous, given that chemistry and materials science affect 96 percent of manufactured goods and 100 percent of humanity. The key is to move to Azure now, both to accelerate progress and to future-proof your investments: Azure is home to Microsoft's incredible AI and high-performance computing capabilities at scale today, and will be home to our quantum machine in the future.
3. A hyperscale cloud with AI, HPC and quantum will create unprecedented opportunities for innovators
Only when a quantum machine is developed and integrated in parallel with AI supercomputers at the scale of Azure can we achieve the greatest impact from computing. With Azure, innovators can design and run a new class of high-impact cloud applications that seamlessly combine AI, HPC, and quantum technology at scale. For example, imagine the powerful applications of the future that will let researchers use AI at scale to sift through huge datasets, HPC to narrow down the options, and quantum computing at scale to improve model accuracy. Thanks to the seamless integration of HPC, AI, and quantum in Azure, such scenarios will be possible within a single application. To take advantage of this unprecedented opportunity, that deep integration with Azure must be advanced today. As we bring HPC and AI together for advanced capabilities, we are also extending the classical-quantum integration that is available now.
Today, Microsoft took a significant step toward this vision by releasing our new integrated hybrid capability in Azure Quantum to the public.
The ability to co-design hybrid quantum applications using a mix of classical and quantum code will enable today's quantum innovators to create a new class of algorithms. For example, developers can now create adaptive phase estimation algorithms that take advantage of classical computation and can iterate and adapt while the physical qubits remain coherent. Students can begin learning algorithms without having to draw circuits, using high-level programming constructs such as branching on qubit measurements (if statements), loops (for statements), calculations, and function calls. In addition, scientists can now more easily explore ways to advance quantum error correction at the physical level on real hardware. All in all, a new generation of quantum algorithms and protocols previously described only in scientific papers can now run elegantly on quantum hardware in the cloud. An important milestone on the road to scaled quantum computing has been reached.
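To illustrate the kind of algorithm this enables, here is a minimal classical simulation of iterative phase estimation, where the feedback rotation in each round branches on the bits measured so far, exactly the "if on a qubit measurement" pattern described above. This is an illustrative sketch in plain Python, not Azure Quantum or Q# code; `measure_ancilla` samples the Born-rule outcome instead of running on hardware:

```python
import math
import random

def measure_ancilla(theta, rng):
    """Simulate measuring the ancilla qubit: P(1) = sin^2(theta / 2)."""
    return 1 if rng.random() < math.sin(theta / 2) ** 2 else 0

def iterative_phase_estimation(phi, bits, seed=0):
    """Estimate a phase phi in [0, 1) to `bits` binary digits.

    Each round measures one bit (least significant first), and the
    feedback rotation for the next round is computed classically from
    all bits measured so far while the qubit is still coherent.
    """
    rng = random.Random(seed)
    estimate = 0.0  # phase estimate accumulated from measured bits
    for k in reversed(range(bits)):
        # Phase picked up from controlled-U^(2^k), minus classical feedback.
        theta = 2 * math.pi * ((2 ** k) * phi - (2 ** k) * estimate)
        bit = measure_ancilla(theta, rng)
        estimate += bit / 2 ** (k + 1)
    return estimate

print(iterative_phase_estimation(0.3125, 5))  # 0.3125 (binary 0.01010)
```

For a phase that is an exact 5-bit binary fraction, each round's measurement is deterministic and the estimate is recovered exactly; for general phases the same loop yields the nearest dyadic approximation with high probability. The classical branching inside the loop is what the Integrated Hybrid feature makes expressible on real hardware.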
Learn more about Azure Quantum
Azure is where all these innovations come together and where your investments are future-proofed. It is where you can be quantum-ready and quantum-safe, and as the cloud scales, so do your opportunities to make an impact. Join Mark Russinovich, Azure Chief Technology Officer and Technical Fellow at Microsoft, and me as we explore the future of the cloud in an upcoming Microsoft Quantum Innovation Series event this fall.