What is Quantum Computing, why is it hard, and how might it impact AI and deliver a potential "AI dividend"? With breakthroughs in materials science driving new modes of computing capability, how does it all tie together?
The expectation is that stable and reliable quantum computers could deliver an AI dividend. To see why, it's important to understand what Quantum Computing is, why it is hard, and how it may impact AI.
What is Quantum Computing?
Let us start by stating what we mean by Quantum Computing; after all, it isn't that long ago that such a term would only be mentioned in science fiction (on The Orville, a Quantum Drive powers the spaceship!).
Quantum Computing is a multidisciplinary field that combines aspects of Computer Science, Physics and Mathematics, and relies heavily on R&D into esoteric new materials. Quantum computers exploit quantum-mechanical phenomena to carry out operations or calculations, and for certain classes of problems they are designed to operate significantly faster than mainstream classical computers.
Within Quantum Computing the basic unit of information is the qubit (or quantum bit). Although at one level it performs the same sort of function as a traditional bit, unlike a bit (which can only be in one of two states) a qubit can exist in a superposition of its two "basis" states, that is, a weighted combination of both states at once. A quantum computer program uses qubits and quantum gates as building blocks to create systems that can solve certain problems or perform certain calculations dramatically faster than classical approaches. These programs are often developed using languages like Python with specialized quantum libraries.
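To make superposition concrete, here is a minimal, purely classical sketch in plain Python (no quantum hardware or SDK required): a single qubit's state is held as two complex amplitudes, a Hadamard gate puts |0⟩ into an equal superposition, and the Born rule gives the measurement probabilities.

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta) for |0> and |1>,
# with |alpha|^2 + |beta|^2 = 1.
ket0 = (1 + 0j, 0 + 0j)  # the basis state |0>

def apply_hadamard(state):
    """Apply the Hadamard gate H = 1/sqrt(2) * [[1, 1], [1, -1]]."""
    alpha, beta = state
    s = 1 / math.sqrt(2)
    return (s * (alpha + beta), s * (alpha - beta))

def measurement_probabilities(state):
    """Born rule: the probability of each outcome is the squared amplitude."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

superposed = apply_hadamard(ket0)
p0, p1 = measurement_probabilities(superposed)
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # equal superposition: 0.50 each
```

Real quantum SDKs express the same idea as gates appended to a circuit; the point here is simply that a superposed qubit carries both amplitudes at once until measured.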
So what’s the Problem with Quantum Computers?
So, if Quantum Computers are so fast, why aren’t they already in widespread use?
The main issue is that they are hugely complex and still the subject of extensive academic and commercial research.
They have demanding physical requirements, such as operating temperatures close to absolute zero. In this current phase, often referred to as the NISQ (Noisy Intermediate-Scale Quantum) era, maintaining stable qubits so that they operate reliably has proven difficult to scale, and the inherent noise in these systems presents major challenges.
Finally, knowledge and experience of how to program such computers is still somewhat in its infancy.
Reliable and scalable Quantum Computing
This is the holy grail of Quantum Computing. Historically, keeping qubits stable has required very difficult and eye-wateringly expensive fine-tuning using analogue controls, which to date hasn't been practical for the kind of large commercial application that would require trillions of operations on millions of qubits.
Enter Microsoft's Majorana chip
In February 2025 Microsoft announced its first quantum chip, the Majorana 1. This quantum chip aims to revolutionise the use of Quantum Computing by making it scalable and reliable.
The chip is powered by a new Topological Core architecture that Microsoft hopes will create quantum computers capable of solving real world problems and building real world applications in years rather than decades.
This Topological Core architecture exists thanks to a new type of material - topoconductors - that can ‘observe and control Majorana particles’ (aka Majorana fermions)—particles theorised nearly 90 years ago by Italian physicist Ettore Majorana.
Microsoft claim that in the same way that “the invention of semiconductors made today’s smartphones, computers and electronics possible, topoconductors and the new type of chip they enable offer a path to developing quantum systems that can scale to a million qubits."
Microsoft go on to say “This breakthrough required developing an entirely new materials stack made of indium arsenide and aluminium, much of which Microsoft designed and fabricated atom by atom.”
Quantum in the Cloud: Amazon Braket and the Ocelot Chip
While manufacturers race to perfect their core hardware, the main path for developers to access quantum computing today is via the cloud. Amazon Braket, AWS's fully managed quantum computing service, stands out by offering a unified development environment and access to one of the most diverse ranges of hardware on the market.
Braket currently gives users on-demand access to:
Superconducting Qubits (from partners like Rigetti and IQM)
Trapped-Ion Qubits (from IonQ)
Neutral-Atom Qubits (from QuEra)
This platform provides essential tools like high-performance simulators and simple pay-as-you-go pricing, dramatically lowering the barrier to entry for practical quantum application development.
In February 2025, AWS also joined the hardware race by announcing its own prototype quantum chip, Ocelot. This chip uses a novel design called cat qubits to build error correction in from the ground up, with the aim of reducing the resources needed for fault-tolerance by up to 90%. Furthermore, in late 2025, Braket launched "Program Sets," a key feature for developers that packages up to 100 quantum circuits into a single task, reducing execution time for quantum machine learning (QML) algorithms like VQE by up to 24x.
Azure Quantum and the Topological strategy
This topological approach is the core of Microsoft’s commitment to achieving Fault-Tolerant Quantum Computing (FTQC). The Azure Quantum cloud platform, while also offering access to commercial hardware from partners (like IonQ and Quantinuum), is their primary delivery system for this innovation.
The goal is that a topological qubit, due to the non-local nature of its quantum information, will be inherently resistant to local environmental noise.
This intrinsic protection is expected to vastly reduce the overhead required for error correction, making the path to a stable, million-qubit system much more efficient than with current architectures.
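To see why error-correction overhead matters, here is a toy illustration in plain Python of the simplest idea behind it: a three-copy repetition code with majority voting. This is a classical analogue of the quantum bit-flip code (real quantum codes are far more involved, since quantum states cannot simply be copied), but it shows how redundancy trades extra physical resources for a lower logical error rate.

```python
import random

def encode(bit):
    """Protect one logical bit by storing three physical copies."""
    return [bit, bit, bit]

def apply_noise(codeword, flip_prob, rng):
    """Independently flip each physical bit with probability flip_prob."""
    return [b ^ 1 if rng.random() < flip_prob else b for b in codeword]

def decode(codeword):
    """Majority vote recovers the logical bit if at most one copy flipped."""
    return 1 if sum(codeword) >= 2 else 0

rng = random.Random(42)
flip_prob = 0.05
trials = 100_000
raw_errors = sum(rng.random() < flip_prob for _ in range(trials))
coded_errors = sum(
    decode(apply_noise(encode(0), flip_prob, rng)) != 0 for _ in range(trials)
)
print(f"unprotected error rate: {raw_errors / trials:.4f}")   # ~0.05
print(f"encoded error rate:     {coded_errors / trials:.4f}")  # ~0.007
```

The trade-off is the overhead: three physical bits per logical bit here, whereas current quantum architectures may need hundreds or thousands of physical qubits per logical qubit. That overhead is exactly what an intrinsically noise-resistant topological qubit aims to shrink.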
IBM Qiskit versus Google Cirq
Regardless of the revolutionary hardware beneath the hood, the primary interface for developers is the Software Development Kit (SDK). Today, the choice often comes down to two dominant, Python-based, open-source frameworks, which offer different philosophies for circuit design.
| Feature | IBM Qiskit | Google Cirq |
| --- | --- | --- |
| Provider | IBM Quantum | Google Quantum AI |
| Philosophy | Abstraction-First: Qiskit's core strength is its Transpiler, which automatically optimizes a user's circuit to fit the specific connectivity and gate set of the target machine. This makes it highly portable and beginner-friendly. | Control-First: Cirq is explicitly designed to give the programmer fine-grained, hardware-aware control over operation timing and qubit placement. This is critical for performance tuning on noisy hardware. |
| Ecosystem | The largest open-source community, extensive educational resources (the Qiskit Textbook), and direct cloud access to IBM's large, multi-system fleet. | The primary tool for Google's internal research and execution on its proprietary superconducting processors (e.g., Sycamore, Willow). Integrated with TensorFlow Quantum for QML research. |
| Course Relevance | Essential for its community size and accessibility across all levels, offering experience on a wide range of superconducting systems. | Essential for demonstrating the challenges of the NISQ era and for students focused on cutting-edge Quantum Machine Learning (QML). |
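To illustrate what a transpiler actually does, here is a plain-Python sketch (no Qiskit or Cirq needed) of the kind of rewrite it performs: decomposing a Hadamard gate into the Rz/Rx rotation gates that many superconducting devices support natively, then checking the result matches the original up to a global phase.

```python
import cmath
import math

def matmul(a, b):
    """Multiply two 2x2 complex matrices."""
    return [
        [sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
        for i in range(2)
    ]

def rz(theta):
    """Rotation about the Z axis."""
    return [[cmath.exp(-1j * theta / 2), 0], [0, cmath.exp(1j * theta / 2)]]

def rx(theta):
    """Rotation about the X axis."""
    c, s = math.cos(theta / 2), math.sin(theta / 2)
    return [[c, -1j * s], [-1j * s, c]]

H = [[1 / math.sqrt(2), 1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]

# A transpiler-style rewrite: H == Rz(pi/2) . Rx(pi/2) . Rz(pi/2), up to a
# global phase (which has no observable effect on measurement outcomes).
decomposed = matmul(rz(math.pi / 2), matmul(rx(math.pi / 2), rz(math.pi / 2)))

phase = H[0][0] / decomposed[0][0]  # recover the global phase factor
ok = all(
    abs(H[i][j] - phase * decomposed[i][j]) < 1e-12
    for i in range(2) for j in range(2)
)
print("decomposition matches H up to global phase:", ok)
```

Qiskit's transpiler applies rewrites like this automatically for whichever backend you target; Cirq tends to leave such hardware-aware choices in the programmer's hands, which is precisely the philosophical split in the table above.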
Beyond topological: the superconducting and trapped-ion race
The quantum landscape is a multi-path race, and Microsoft's topological approach is far from the only game in town:
Superconducting Qubits (IBM & Google): Both tech giants leverage superconducting circuits (like the transmon) cooled to near absolute zero. IBM, which provides its systems via the cloud through IBM Quantum Experience, has a clear roadmap aiming for a fault-tolerant system before 2030, employing advanced error-correction codes (like LDPC) to meet its scaling goals.
Trapped-Ion Qubits (IonQ & Quantinuum): These systems use electromagnetic fields to suspend individual charged atoms (ions) and lasers to manipulate them. They typically boast higher fidelity (lower error rates) and are considered excellent for early-stage applications. For instance, the IonQ Forte system (available on Braket) has demonstrated its utility in complex applications, including collaborations with companies like AstraZeneca on quantum-accelerated drug development.
This diverse landscape means that today's quantum engineers must be fluent in the foundational concepts—qubits, quantum gates, and entanglement—that underpin all these competing technologies.
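Entanglement, the last of those foundational concepts, can also be sketched in a few lines of plain Python: applying a Hadamard and then a CNOT to two qubits starting in |00⟩ produces a Bell state, in which the two measurement outcomes are perfectly correlated regardless of which hardware platform runs the circuit.

```python
import math

# Two-qubit state as four amplitudes for |00>, |01>, |10>, |11>.
state = [1 + 0j, 0j, 0j, 0j]  # start in |00>

def hadamard_on_first(s):
    """Apply H to the first qubit of a two-qubit state."""
    r = 1 / math.sqrt(2)
    return [r * (s[0] + s[2]), r * (s[1] + s[3]),
            r * (s[0] - s[2]), r * (s[1] - s[3])]

def cnot(s):
    """CNOT with the first qubit as control: swaps |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

bell = cnot(hadamard_on_first(state))
probs = [abs(a) ** 2 for a in bell]
for label, p in zip(["00", "01", "10", "11"], probs):
    print(f"P({label}) = {p:.2f}")
# Only |00> and |11> are possible: the qubits always agree when measured.
```

The same two-gate circuit is usually the "hello world" of every quantum SDK, which is why qubits, gates, and entanglement transfer directly across the competing technologies.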
How could Quantum Computing impact AI?
Right at the start of this blog we said that stable and reliable quantum computers could deliver an AI dividend, so what could that look like? Many of the AI systems being built today require huge processing power to deliver their potential. At the core of these systems are processing units sometimes referred to as AI Chips.
The term AI Chip refers to a fairly broad classification of processing units, from graphics processing units (GPUs) and Neural Processing Units (NPUs) to field-programmable gate arrays (FPGAs) and Application-Specific Integrated Circuits (ASICs). Some of the recent discussion around the Chinese Generative AI system DeepSeek versus other (mostly American) systems such as ChatGPT centred on how many AI Chips were required to develop these systems (and there's plenty of speculation around this).
Given the current focus on AI for Large Language Models (LLMs) that power platforms such as ChatGPT, as well as autonomous vehicles, robotics and "AI in your hand" (e.g. smartphones, and wearables such as watches), this trend is only likely to increase, and Quantum Computing may prove integral to the next advances in scale.
The AI Dividend and Quantum Computing
Quantum Computing is not about creating new forms of AI Chips per se or indeed hardware to run AI systems (unlike attempts back in the 80s to create machines capable of running AI languages such as the Symbolics LISP machine).
However, the very fact that Quantum Computing allows extremely fast processing may open up new approaches, new techniques and drive innovation in the AI and machine learning fields. In fact, this concept of Quantum Computing and AI combined has its own term: Quantum AI.
Those in the field hope that Quantum AI will revolutionise industries such as finance, cybersecurity and healthcare. Even now, delivery companies such as FedEx and DHL are exploring quantum-computing-based optimisation, combined with machine learning, to find the most efficient routes that meet a given set of constraints.
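As a toy illustration of the kind of constrained routing problem involved, here is a purely classical brute-force sketch in plain Python with made-up distances (real deployments use heuristic, quantum-inspired, or quantum optimisers at vastly larger scale):

```python
from itertools import permutations

# Hypothetical symmetric distance matrix between four delivery stops (km).
dist = [
    [0, 12, 19, 8],
    [12, 0, 14, 21],
    [19, 14, 0, 9],
    [8, 21, 9, 0],
]

def route_length(route):
    """Total length of a round trip starting and ending at stop 0."""
    stops = (0, *route, 0)
    return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

# Brute force over all orderings of the remaining stops -- tractable here,
# but the search space grows factorially with the number of stops, which is
# why large fleets look beyond exhaustive classical search.
best = min(permutations([1, 2, 3]), key=route_length)
print("best route:", (0, *best, 0), "length:", route_length(best))
```

With four stops there are only six orderings to check; at fleet scale the factorial blow-up is exactly the kind of optimisation problem quantum approaches hope to tame.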
UPDATE (Oct 2025): Tangible Quantum AI Results
The real-world application of Quantum AI has recently yielded powerful commercial results. In a September 2025 announcement, IBM and HSBC reported a successful experiment that used quantum computers to improve predictions for bond orders in the European corporate bond market by up to 34%.
This breakthrough, run on IBM's quantum systems, demonstrates how hybrid quantum-classical models can leverage the speed of quantum systems to uncover hidden pricing signals in complex financial data, proving a tangible competitive advantage—the very definition of the 'AI Dividend' we describe.
A note of caution
The initial skepticism surrounding Microsoft's claims has been a major point of discussion in the QC community throughout 2025. Critics point out that the measurements published in February 2025 remain prone to ambiguity, as they can also be consistent with non-topological 'Andreev modes'. For the Majorana 1 to conclusively prove its value, the industry is waiting for public demonstrations of coherent quantum information processing and braiding operations, the key mechanism for topological fault-tolerance. The race is on, but the proof of concept is still a central, active debate in the field.
Unlock the potential: start your quantum journey
The future of Quantum AI, whether driven by topological qubits like the Majorana or superconducting ones from IBM, relies on engineers who can write quantum code today. Concepts like Shor's Algorithm (which factors large numbers) and the optimization challenges faced by companies like FedEx and HSBC rely on the foundational skills of quantum circuit design and error mitigation.
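Shor's algorithm is worth unpacking briefly: its quantum speed-up comes from finding the period r of f(x) = a^x mod N, after which the factors of N fall out with classical arithmetic. The plain-Python sketch below does the period-finding by brute force, which is precisely the step a quantum computer performs exponentially faster.

```python
import math

def find_period(a, n):
    """Brute-force the period r of a^x mod n (the quantum step in Shor's)."""
    x, value = 1, a % n
    while value != 1:
        x += 1
        value = (value * a) % n
    return x

def shor_classical(n, a):
    """Classical post-processing of Shor's algorithm for a chosen base a."""
    r = find_period(a, n)
    if r % 2 != 0:
        return None  # need an even period; retry with another base
    candidate = pow(a, r // 2, n)
    f1, f2 = math.gcd(candidate - 1, n), math.gcd(candidate + 1, n)
    if 1 < f1 < n:
        return f1, n // f1
    if 1 < f2 < n:
        return f2, n // f2
    return None

# 7^4 = 2401 = 1 (mod 15), so the period is 4 and the factors of 15 emerge.
print(shor_classical(15, 7))
```

Brute-force period-finding takes time exponential in the number of digits of N, which is why RSA is safe today and why a fault-tolerant quantum computer running the genuine quantum version would be so disruptive.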
If you are a Developer, Engineer, or Data Scientist ready to move from hype to hands-on, enroll in our brand new Quantum Computing Fundamentals Workshop.
Learn to work with qubits, entanglement, and quantum gates using Python and quantum simulator libraries. Our course syllabus covers the core principles, key algorithms (Grover's, Shor's, VQE), and how to run jobs on real cloud platforms like Amazon Braket.
In summary
Quantum Computing, and in turn Quantum AI, has the potential to revolutionise the fields of AI and Machine Learning. Watch this space and see the future arrive!