Quantum Computing
Quantum computing is based on the principles of superposition and entanglement. In quantum mechanics, superposition describes the ability of a system to remain in multiple states at the same time until it is measured. Measurement in this context refers to any interaction between qubits and an external system, such as a detector or sensor, that would cause the qubits to collapse from multiple states to a single state.
To understand the concept of superposition, it's useful to imagine a coin spinning on its edge. While the coin is spinning, it is analogous to a state of superposition: it is not defined as either heads or tails. However, if the coin is bumped or disturbed in any way, it will stop spinning and end up displaying either heads or tails. In this analogy, the coin is the qubit and the measurement is whatever caused the coin to stop spinning.
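As a minimal sketch of these ideas (pure Python, not any vendor's SDK, and all names here are illustrative), a single qubit can be modeled as 2 complex amplitudes whose squared magnitudes give the measurement probabilities:

```python
import random

# A qubit as 2 amplitudes (alpha, beta) with |alpha|^2 + |beta|^2 = 1.
def measure(alpha, beta):
    """Simulate measurement: collapse to 0 or 1 with the Born-rule probabilities."""
    p0 = abs(alpha) ** 2
    return 0 if random.random() < p0 else 1

# Equal superposition, like the spinning coin: alpha = beta = 1/sqrt(2).
amp = 2 ** -0.5
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(amp, amp)] += 1

print(counts)  # roughly a 50/50 split between 0 and 1
```

Each run collapses the "coin" to a definite face; only the statistics over many runs reveal the underlying superposition.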
Qubits are sensitive to interference from their environment and are usually stored at very low temperatures in computing devices to protect them from influences such as temperature fluctuations, electromagnetic fields and other particles.
For its part, entanglement describes a deep connection between 2 qubits, where the state of 1 qubit is directly dependent on the other qubit, no matter the distance between the 2.
There are 2 key properties of entanglement that applications derived from it depend on. The 1st, the Monogamy of Entanglement, states that if 2 qubits are maximally entangled with each other, that entanglement cannot be shared with a 3rd. The 2nd property, called Maximal Coordination, posits that the quantum state of a system is a combination of all the possible states the system could be in, each associated with a probability of being observed when measured.
Maximal Coordination is an important feature of quantum mechanics that sets it apart from classical physics, and is what allows quantum computer systems to exhibit superposition and entanglement.
History.
On May 11, 2011, D-Wave Systems announced D-Wave One, described as "the world's 1st commercially available quantum computer" operating on a 128-qubit chipset using quantum annealing (a general method for finding the global minimum of a function by a process using quantum fluctuations) to solve optimization problems. The D-Wave One was built on early prototypes such as D-Wave's Orion Quantum Computer. The prototype was a 16-qubit quantum annealing processor, demonstrated on Feb. 13, 2007, at the Computer History Museum in Mountain View, California. D-Wave demonstrated what they claimed to be a 28-qubit quantum annealing processor on Nov. 12, 2007. The chip was fabricated at the NASA Jet Propulsion Laboratory Microdevices Lab in Pasadena, California.
IBM is known for developing the 1st commercially available circuit-based (gate-model) quantum computer, IBM Quantum System One, unveiled in Jan. 2019. In Dec. 2023, the company announced a newer model, IBM Quantum System Two, which is powered by 3 Heron chips, each containing 133 qubits. Quantum System Two is designed to be modular and scalable to accommodate future advancements in quantum computing.
Furthermore, IBM offers a software development kit called Qiskit, an open-source suite of software products that programmers can use to design their own algorithms and run simulations on classical computers before executing them on quantum hardware. Qiskit is based on the Python programming language, making it accessible to anyone interested in learning. The newest version, Qiskit 1.3, was released in Dec. 2024.
In addition to these developments, IBM has several ambitious projects in quantum computing on its roadmap, and the company has an online platform, the Quantum Network, that gives users access to its cloud-based quantum computing services. In April 2023, EY Global Services joined the network to streamline and enhance its research capabilities.
Google is another big name in tech that's engaged in quantum computing. Its Quantum AI division has been developing processors and algorithms since 2014, and claims to have been the 1st to achieve quantum supremacy in 2019. Google's Sycamore, a transmon superconducting quantum processor with 53 qubits created by Google's Artificial Intelligence division, debuted in Oct. 2019. That month, Google AI Quantum and NASA published a paper in Nature claiming they had achieved "quantum supremacy" in an experiment in which Sycamore's 53 superconducting qubits performed a random quantum circuit sampling task. Google reported that Sycamore completed the calculation in about 200 seconds, while they estimated it would take the world's fastest classical supercomputer at the time (Summit at Oak Ridge) around 10,000 years.
In Dec. 2024, Google achieved a breakthrough with its Willow quantum processor when it demonstrated significantly improved error correction and scalability in quantum computing. The company’s open-source framework, Cirq, allows users to write, manipulate, optimize and run quantum circuits on quantum computers and simulators.
Microsoft has also made strides in quantum computing. Azure Quantum, a component of the company's cloud-computing service, Azure, offers resources for users who want to learn more about quantum computing, including a development kit that can be used to custom-build quantum applications. Its chatbot, a quantum-focused version of Copilot, Microsoft’s AI-powered assistant, can explain unfamiliar concepts and help users navigate the world of quantum computing more easily.
On top of that, the company has been heavily researching ways to build a scaled quantum supercomputer. In May 2023, Microsoft researchers achieved a significant milestone when they successfully created a new type of qubit that can make quantum computers more error-resistant and stable. This represents a huge breakthrough in quantum computing and physics, and Microsoft’s journey to this achievement was detailed in the June 2023 issue of Physical Review B.
In Feb. 2025, Microsoft announced the Majorana 1 chip, which it says can fit a million qubits into a chip comparable in size to a desktop PC CPU, according to The Verge. The company says it has been working on this chip, with the potential to solve large-scale problems, for 17 years. Microsoft is addressing qubit sensitivity by using Majorana particles instead of electrons and developing what the company called the "world's first topoconductor," which can monitor and control Majorana particles, hopefully yielding more stable qubits.
In July 2025, Rigetti Computing announced that its Ankaa-3 system with 4 9-qubit chips achieved a 99.5% 2-qubit gate fidelity, a measure of how accurately an operation between 2 qubits is performed.
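To see why fractions of a percent matter, here is a rough back-of-the-envelope sketch (my approximation, not Rigetti's benchmarking methodology): if errors are independent, per-gate fidelity compounds multiplicatively over the gates in a circuit:

```python
# Approximate whole-circuit success probability from per-gate fidelity,
# assuming independent errors (a simplification).
def circuit_fidelity(gate_fidelity: float, n_gates: int) -> float:
    return gate_fidelity ** n_gates

for n in (10, 100, 1000):
    print(n, circuit_fidelity(0.995, n))
```

At 99.5% per gate, a 100-gate circuit already succeeds only about 60% of the time, which is why the field pushes so hard on fidelity and on error correction.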
Quantum computing startups.
Quantinuum, 1 of the leading quantum computing companies, is a spinout of Honeywell International formed through a merger of Honeywell Quantum Solutions and Cambridge Quantum in Nov. 2021. The company received significant support in early 2024 when JPMorgan Chase led a group of investors in a $300 million funding round.
More recently, on 7/25/2024, PsiQuantum, a private startup with an estimated valuation of $3.1 billion, announced a partnership with the state of Illinois to build the largest U.S. facility dedicated to quantum computing. The California-based company is seeking to house a quantum computer containing up to 1 million qubits within the next decade; according to MIT, the largest quantum computers at the time hosted around 1,000 qubits. Some of the company's most notable investors are asset manager BlackRock and Microsoft. No date has yet been set for a PsiQuantum initial public offering.
Google and IBM make their qubits out of superconducting material. IonQ makes qubits by trapping ions using electromagnetic fields. PsiQuantum is building qubits from photons. A major benefit of photonic quantum computing is the ability to operate at higher temperatures than superconducting systems.
Some types of superconducting quantum processors:

| Tunable-frequency qubits, tunable couplers | Fixed-frequency qubits, fixed coupling |
| --- | --- |
| Used by Google and USTC | Used by RIKEN and IBM |
| Complex circuit | Simple circuit |
| High wiring cost | Low wiring cost |
| Flux bias necessary | No flux bias |
| Fast 2-qubit gates | Slow 2-qubit gates |
| Short coherence | Long coherence |
| Avoids TLS collisions | Prone to TLS collisions |
Which of the 2 is better? As of March 2025, the industry is leaning toward a hybrid approach: fixed-frequency qubits with tunable couplers. A tunable coupler consists of paired transmons with a Josephson-junction loop, placed between fixed-frequency data transmon qubits.
1. How does a classical computer compute 2+2=4 and how does a quantum computer compute 2+2=4?
Classical: computes 2 + 2 = 4 deterministically using voltage-controlled binary logic gates. The CPU represents numbers as patterns of voltage (“high” = 1, “low” = 0) in transistors.
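The classical picture can be sketched directly (an illustrative ripple-carry adder in Python, mirroring the XOR/AND/OR gates a CPU uses; the function names are my own):

```python
# One full adder: sum bit = a XOR b XOR carry_in,
# carry_out = majority(a, b, carry_in).
def full_adder(a, b, carry_in):
    s = a ^ b ^ carry_in
    carry_out = (a & b) | (carry_in & (a ^ b))
    return s, carry_out

def add_bits(x_bits, y_bits):
    """Add 2 equal-length little-endian bit lists, ripple-carry style."""
    out, carry = [], 0
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out

# 2 is binary 10 -> little-endian [0, 1]
print(add_bits([0, 1], [0, 1]))  # [0, 0, 1] = binary 100 = 4
```

Every step is deterministic: fixed voltages in, fixed voltages out.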
Quantum: computes 2 + 2 = 4 using reversible quantum gates, with the extra ability to add numbers that are in superpositions, producing superposed outputs. But if both inputs are definite, you still just get 4. Quantum gates (unitary operations) can be wired into a quantum ripple-carry adder or quantum Fourier transform adder.
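The quantum version can be traced on definite inputs with a hedged sketch (an out-of-place reversible adder of my own layout, not an optimized design like Cuccaro's): on non-superposed basis states, CNOT acts as XOR-into-target and Toffoli as AND-then-XOR-into-target, so plain bits suffice to follow the circuit:

```python
# Reversible adder built only from CNOT/Toffoli-style updates.
def quantum_style_add(x, y):
    """x, y: equal-length little-endian bit lists.
    Returns len(x)+1 sum bits."""
    n = len(x)
    s = [0] * n          # sum register, starts in |0...0>
    c = [0] * (n + 1)    # carry ancillas, start in |0...0>
    for i in range(n):
        # s[i] = x[i] XOR y[i] XOR c[i]  (3 CNOTs)
        s[i] ^= x[i]
        s[i] ^= y[i]
        s[i] ^= c[i]
        # c[i+1] = majority(x[i], y[i], c[i]) as XOR of pairwise ANDs (3 Toffolis)
        c[i + 1] ^= x[i] & y[i]
        c[i + 1] ^= x[i] & c[i]
        c[i + 1] ^= y[i] & c[i]
    return s + [c[n]]

# 2 is binary 10 -> little-endian [0, 1]
print(quantum_style_add([0, 1], [0, 1]))  # [0, 0, 1] = binary 100 = 4
```

Because every gate is reversible, the same circuit run on superposed inputs would produce the corresponding superposition of sums.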
1b. How do trapped-ion, electron-spin, photonic, and superconducting-circuit quantum computers compute 2+2=4?
The difference lies in how a qubit is physically realized and how the gates are implemented. No matter the platform, if you ask the machine to compute 2 + 2, it will ultimately encode 2, apply a quantum addition circuit, and measure 4.
1. Trapped-ion quantum computers
Qubit: electronic states of individual ions (e.g., ytterbium or calcium). A laser can flip the ion between |0〉 and |1〉.
Gates:
Single-qubit gates: controlled by laser pulses that change the ion’s internal state.
2-qubit gates: use shared vibrational motion of ions in a trap.
Encode 2 as a qubit state (binary 10), use laser-driven gates to implement a reversible ripple-carry adder. When measured, the ion states collapse to the binary representation of 4 (100).
2. Spin-based (electron spins in quantum dots)
Qubit: spin-up (|↑〉) = 0, spin-down (|↓〉) = 1, in a tiny semiconductor dot.
Gates:
Magnetic fields and microwaves flip spins (single-qubit gates).
Exchange interaction between neighboring spins implements 2-qubit gates.
Spins store the binary input, gate pulses swap and entangle spins to perform logic of addition. After the circuit, the spin configuration corresponds to 100 (decimal 4).
3. Photonic quantum computers
Qubit: polarization of a photon (horizontal = |0〉, vertical = |1〉), or sometimes path encoding (photon in upper path = 0, lower path = 1).
Gates:
Beam splitters, phase shifters, and wave plates act as single-qubit gates.
2-qubit gates require nonlinear optics or measurement-based schemes.
Encode numbers in photon modes. Optical circuits perform the addition logic (usually via measurement-based quantum computing). Output photons collapse to represent 100.
4. Superconducting circuits
Qubit: tiny loops of superconducting current (Josephson junctions). States are superpositions of different current or charge states.
Gates:
Microwave pulses drive transitions between |0〉 and |1〉.
2-qubit gates use capacitive or inductive coupling between circuits.
Qubits start in a binary encoding of 2. Microwave-driven gate sequence executes the adder. The resulting qubit states collapse to 100.
Summary: all 4 platforms ultimately compute 2 + 2 = 4 by encoding numbers into qubits, running a sequence of quantum gates (a quantum addition algorithm), and measuring the output qubits.
Differences on what a qubit physically is and how gates are applied:
Trapped ions: internal electronic states, laser pulses.
Spin qubits: electron spins, magnetic/exchange interactions.
Photons: polarization/path, optical elements.
Superconducting: current/charge states, microwave pulses.
If both inputs are definite (not superposed), then all 4 machines just give 4. But their real power shows when the inputs are superpositions, giving quantum parallelism.
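That parallelism can be illustrated with a toy simulation (pure Python, not a real quantum SDK; `add2` stands in for a reversible +2 circuit and the layout is my own assumption): a unitary adder maps each basis state in a superposition independently, producing superposed outputs.

```python
# Amplitudes over integer basis states; (|1> + |3>)/sqrt(2) as input.
amp = 2 ** -0.5
state = {1: amp, 3: amp}

def add2(x):
    return x + 2  # stand-in for a reversible +2 circuit on basis states

# The adder acts on every branch of the superposition at once.
out = {basis_add2: amplitude
       for basis, amplitude in state.items()
       for basis_add2 in [add2(basis)]}
print(out)  # amplitudes now sit on 3 and 5; measurement yields 3 or 5, 50/50
```

The catch, of course, is that measurement returns only one branch, so useful algorithms (Shor, Grover) must interfere the branches cleverly before measuring.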
News.
9/24/2025.
Caltech physicists have created the largest qubit array ever assembled: 6,100 neutral-atom qubits trapped in a grid by lasers. Previous arrays of this kind contained only hundreds of qubits. "This is an exciting moment for neutral-atom quantum computing," says Manuel Endres, professor of physics at Caltech. "We can now see a pathway to large error-corrected quantum computers. The building blocks are in place." Endres is the principal investigator of the research published today in Nature. 3 Caltech graduate students led the study: Hannah Manetsch, Gyohei Nomura, and Elie Bataille.
The team used optical tweezers—highly focused laser beams—to trap thousands of individual cesium atoms in a grid. To build the array of atoms, the researchers split a laser beam into 12,000 tweezers, which together held 6,100 atoms in a vacuum chamber. "On the screen, we can actually see each qubit as a pinpoint of light," Manetsch says. "It's a striking image of quantum hardware at a large scale."
A key achievement was showing that this larger scale did not come at the expense of quality. Even with more than 6,000 qubits in a single array, the team kept them in superposition for about 13 seconds, nearly 10x longer than what was possible in previous similar arrays, while manipulating individual qubits with 99.98% accuracy. "Large scale, with more atoms, is often thought to come at the expense of accuracy, but our results show that we can do both," Nomura says. "Qubits aren't useful without quality. Now we have quantity and quality."
The team also demonstrated that they could move the atoms hundreds of micrometers across the array while maintaining superposition. The ability to shuttle qubits is a key feature of neutral-atom quantum computers that enables more efficient error correction compared with traditional, hard-wired platforms like superconducting qubits.
9/28/2025.
Harvard scientists just unveiled a system that is 10x bigger than its predecessors and the 1st quantum machine able to operate continuously without restarting. In a paper published in the journal Nature (9/25/2025), the team demonstrated a system of more than 3,000 quantum bits (or qubits) that ran for more than 2 hours, surmounting a series of technical challenges and representing a significant step toward quantum supercomputers that could revolutionize science, medicine, finance, and other fields.
"We demonstrated the continuous operation with a 3,000-qubit system," said Mikhail Lukin, Joshua and Beth Friedman University Professor and co-director of the Quantum Science and Engineering Initiative, and senior author of the new paper. "But it's also clear that this approach will work for much larger numbers as well."
The Harvard-led collaboration included researchers from MIT and was jointly headed by Lukin, Markus Greiner, George Vasmer Leverett Professor of Physics, and Vladan Vuletic, Lester Wolfe Professor of Physics at MIT. The team conducts research in collaboration with QuEra Computing, a startup company spun out from Harvard-MIT labs.