Not all cats are grey at night — comparing different qubit technologies

Jonas Araujo
May 31, 2022
9 minutes

Sci-fi movies often drop names to sound more sciency. Take Marvel’s Ant-Man and the Wasp: in its two-hour runtime, the word quantum is spoken (out of context) 22 times [1]. But putting quantum in front of everything does not cut it. What does quantum mean, after all? The answer is very simple.

A quantum (plural: quanta) is a unit of something. For example, you could call a water molecule a quantum of water. The word was coined to describe the smallest amount of energy exchanged in atomic processes. It is the basis of the concept of discreteness: the property of being countable, as opposed to being continuous.

Below we will see that an atom can be in its ground state or in an excited state; that a photon (a particle of light) has two orthogonal polarization states; and that an electron has two spin levels. It is natural to map these 2-level systems onto a binary system. In this sense, nature has its own quantum bits, or qubits.

Like the bits we humans use in our computers, nature’s qubits have two possible values that we label as 0 and 1 and expect to compute stuff with. The sheer universality of discreteness in nature means there are many qubit types at our disposal, each with its peculiarities, advantages, and drawbacks. Examples of natural qubits are:

Electron spins

They behave like tiny magnets, in fact the smallest magnets in the universe: their radius is at most around 10^-30 meters [2]!

Photon polarization

A photon’s polarization can be described by two orthogonal states. For example, you can make photons linearly polarized, either horizontally or vertically.


Atomic energy levels

Atoms can be in their ground state or in excited states. An excited state is reached when the atom absorbs quanta of light (photons), and the atom returns to the ground state by emitting them.

Besides natural qubits, we have developed other microscopic systems that display this 2-level behavior. Here we talk about the main qubit properties and criteria for their use in quantum computing, and the status of the candidates for building scalable, fault-tolerant quantum computers.

Let’s go over the challenges qubits face!

Coherence versus noise

Qubits, unlike their classical siblings, can be in superposition states of 0 and 1. They can also be entangled with one another, meaning their states are linked: measurement outcomes on one qubit are correlated with those on the other, no matter how far apart they are! The power and promise of quantum computing rely on these two features combined.
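These two features can be made concrete with a few lines of linear algebra. Below is a minimal sketch in NumPy (states as plain amplitude vectors, no quantum library assumed) of a superposed qubit and an entangled Bell pair:

```python
import numpy as np

# Computational basis states |0> and |1> as amplitude vectors
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# A single qubit in equal superposition: (|0> + |1>)/sqrt(2)
plus = (zero + one) / np.sqrt(2)

# Two-qubit Bell state (|00> + |11>)/sqrt(2): maximally entangled
bell = (np.kron(zero, zero) + np.kron(one, one)) / np.sqrt(2)

# Born rule: measurement probabilities are squared amplitudes.
# Only the outcomes 00 and 11 occur, each with probability 1/2,
# i.e., the two qubits always agree, however far apart they are.
probs = bell**2
print(probs)  # [0.5 0.  0.  0.5]
```

The perfectly correlated 00/11 outcomes are what "linked" means here: no state of the first qubit alone can reproduce them.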

Superposition and entanglement, however, are very delicate. If the qubits are not well isolated or if our manipulations are clumsy, they decohere and the system loses its quantumness. Coherence times tell us how long the qubits keep their quantumness, while noise stands for undesirable interactions of the qubits with their surroundings, interactions among the qubits themselves, or imprecise manipulation of the system.

Ideally, we want qubits that have long coherence times, that are easy to control, and whose operation errors can be fixed. A quantum computer made with qubits of this kind is called a fault-tolerant quantum computer — all others are noisy quantum computers. For instance, a quantum computer with ~200 error-corrected qubits could perform a Monte Carlo simulation in a few hours, while a classical supercomputer would take about a decade to run the same task [3].

Level of control

Qubits must be easy to control and respond well to the operations performed on them. Examples of operations are rotating the spin of an electron, inverting a photon’s polarization, or entangling an excited atom with another in the ground state.
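Each of these operations is just a unitary matrix acting on the qubit's state vector. A minimal NumPy sketch, using the standard textbook gate names rather than any particular hardware's control language:

```python
import numpy as np

zero = np.array([1.0, 0.0])  # |0>
one = np.array([0.0, 1.0])   # |1>

# Pauli-X ("NOT") gate: flips |0> <-> |1>, like inverting a polarization
X = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Rotation about the y-axis by angle theta, like rotating a spin
def ry(theta):
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s],
                     [s,  c]])

# CNOT: flips the target qubit when the control qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

print(X @ zero)              # [0. 1.] -> the qubit was flipped to |1>
print(ry(np.pi / 2) @ zero)  # equal superposition of |0> and |1>

# Entangling: rotate the control into superposition, then apply CNOT;
# the result is the Bell state (|00> + |11>)/sqrt(2)
state = CNOT @ np.kron(ry(np.pi / 2) @ zero, zero)
print(state)
```

The last two lines show why controlled gates matter: a single-qubit rotation followed by a CNOT is enough to entangle a pair.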

Besides doing what we want, we expect qubits to do no more than that, so we must somehow keep them from interacting with one another unintentionally (this unwanted communication is called cross-talk). This is particularly difficult, as you cannot simply ask qubits not to talk to their neighbors! They are as talkative as schoolchildren!


Scalability

A scalable system can be increased in size with no major change in its architecture and no loss of performance. In practice, scalability measures how challenging it is to increase the size of current quantum computers. Strictly speaking, this is not a property of the qubits themselves, but of the techniques for storing and controlling them. The main difficulties are insulation, control, and coherence times; they are interconnected and often overlap.

Insulation requires shielding the qubits from unwanted external interactions, often arising from thermal fluctuations in their surroundings. With few exceptions (e.g., nitrogen-vacancy centers and photons), qubits require temperatures of the order of millikelvin, much colder than interstellar space (around 3 kelvin). On top of that, the components that control the qubits must operate at these extreme temperatures. For this task, cryogenic chips were developed, as regular chips do not function properly at such low temperatures [4].

These chips must enable precise qubit control, as real-case applications require at least hundreds to thousands of ideal qubits, well beyond the current technology’s reach. For these applications, the algorithms require high qubit connectivity and precise logic-gate operations. The larger the number of qubits, the more difficult it is to avoid cross-talk and to connect physically distant qubits.

The control of qubits should take place before they decohere, that is, before they lose their quantumness. This time window changes depending on the insulation and qubit properties and should be long enough to allow the logic-gate operations. The longer the coherence times, the more operations can be performed and the more useful your quantum computer is.

The different quantum computing paradigms

Whatever their nature, qubits can be used in different paradigms of quantum computing. The universal models are equivalent, as they can be translated into one another and are universal Turing machines; non-universal models, in contrast, have limited (yet often useful) applicability.

Different paradigms of quantum computing and specific models.

In practice, the operations required by the quantum computing model help us choose the most suitable qubit. For example, if your child wants to know the permanent of a given matrix, you might want to use boson sampling (I already wrote an article on this subject). Because this non-universal computing model requires no connectivity between qubits, photons are a great fit.
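For context: the permanent is like a determinant without the alternating signs, and computing it exactly is believed to take exponential classical time even with the best known algorithms, such as Ryser's formula. That hardness is precisely what makes boson sampling interesting. A small self-contained Python sketch (a brute-force variant over column subsets, fine for tiny matrices):

```python
from itertools import combinations

def permanent(A):
    """Permanent of a square matrix via Ryser's formula, O(2^n * n^2)."""
    n = len(A)
    total = 0
    # Sum over all non-empty subsets of columns (inclusion-exclusion)
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            prod = 1
            for row in A:
                prod *= sum(row[j] for j in cols)
            total += (-1) ** k * prod
    return (-1) ** n * total

# perm([[1, 2], [3, 4]]) = 1*4 + 2*3 = 10 (no minus sign, unlike det)
print(permanent([[1, 2], [3, 4]]))  # 10
```

Even this clever inclusion-exclusion trick only reduces the cost from n! to 2^n terms, which is why a photonic sampler is an appealing shortcut.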

Having in mind coherence, level of control, and scalability, let us have a look at the current qubit technologies to evaluate the pros and cons of each, and provide up-to-date specs, when available. I follow mostly the 2016 review by Gabriel Popkin in Science [5], while trying to update it whenever data is available in the manufacturers’ public documents. We start with the most mature technologies.

Also, another article dedicated to the different quantum computing paradigms and models will follow soon. Stay tuned!

The technologies

Superconducting loops — main manufacturers: Alice&Bob, IBM, Google, Rigetti, D-Wave

These qubits are based on miniature superconducting loops of current that are manipulated via microwaves. They are by far the most popular qubit implementation and were featured in Google’s first announcement of quantum supremacy [6].

Taking no shortcuts, the French startup Alice&Bob proposes scalable noise-resistant cat qubits. Their qubits are based on continuous-variable quantum systems and offer superior fault-tolerant capabilities.

In general, these qubits have coherence times of tens to a hundred microseconds, considerably shorter than those of technologies like trapped ions (~1 hour) or diamond vacancies (~10 seconds). However, the operations can be performed much more quickly! The maximal number of entangled qubits reported was 65 [7], against about 20 for trapped-ion devices [8].

Pros: fast gate-operation times; builds on the existing semiconductor industry; advanced cryogenic control; above-average inter-qubit connectivity

Cons: short coherence times; need supercooling; qubits are not naturally identical — tuning required!

Ion traps — main manufacturers: IonQ, Honeywell

The most remarkable feature of these qubits is their stability, with even a recent claim of ~1-hour coherence times [9]. Gate operations take ~100 microseconds [10], orders of magnitude longer than the ~10 nanoseconds a superconducting gate operation takes [11]. Although slower than operations on superconducting qubits, their 2-qubit gates reach higher fidelities.

Their operation, unlike that of superconducting qubits, does not require millikelvin temperatures: at most “just” 4 kelvin, or -269 °C [12].

Pros: unmatched stability; no need for millikelvin cooling; high gate fidelities; naturally identical qubits

Cons: long gate-operation times; technology not as mature as superconducting qubits

Cold neutral atoms — main manufacturers: Pasqal, Atom Computing, QuEra

These are very similar to ion traps; the main difference is that the atoms used are electrically neutral and are held in place by optical tweezers. Their gate operations take about 1 microsecond, against the ~100 microseconds of ion traps.

Their atomic arrays motivate their use in simulating many-body systems with remarkable properties, such as superconductivity and superfluidity. They also make neutral-atom computers ideal for tackling optimization problems via analog computing, as opposed to gate-based computing.

Pros: Long coherence times; possibility of designing 3D arrays of atoms; high fidelities; naturally identical qubits; faster than ion-traps

Cons: slower gates than superconducting qubits; many lasers are needed to keep the atoms in place — limited scalability

Photonics — main manufacturers: Quandela, Xanadu, PsiQuantum

There are many advantages to working with photons, among them the facts that you can control them at room temperature, that they are easy to transport, and that they are all naturally identical. In principle, they face no decoherence, but the drawbacks are severe and the technology is still in its infancy.

Quandela, a French startup and leader in photonics-based quantum computers, already provides random number generators that are key for cryptographic security in classical or quantum communication networks, blockchain verification, and numerical simulations. They have exciting work on producing entangled photons and have recently released a quantum photonic development toolkit for users to test their algorithms.

Nevertheless, a photon cannot be stopped, as by definition it always moves at the speed of light. Also, photons do not interact easily with one another, so entangling them requires sophisticated nonlinear operations. In addition, once you detect them, they are destroyed. Hence:

Pros: no need for supercooling; much of the silicon hardware is already available; long coherence times

Cons: no entangling gates yet, so no connectivity

Diamond vacancy centers — main manufacturers: Quantum Brilliance, Diatope

The interaction between a nitrogen atom and a vacancy in a diamond lattice provides a reasonably stable (~1 second) qubit at room temperature [13]. These qubits are often used in quantum metrology protocols for ultra-sensitive measurements; an example is their use in gravimetry [14]. Typical gate operations take hundreds of nanoseconds [15].

Pros: can operate at room temperature; very stable

Cons: difficult to entangle qubits

It is time to sum up the front-runners!


In the connectivity criteria, the leader is… superconducting qubits!

While in stability, the leader is… trapped ions!

As for the level of control… superconducting qubits are leading the race!

But in temperature tolerance, we have a clear leader… diamond vacancy centers!

Although cold atoms compete with trapped ions in stability, their scalability faces challenges, such as handling the optical tweezers that keep the atoms in place. Accordingly, one of the main manufacturers estimated an upper limit of ~1000 qubits [15].

The bottom line

All the leading qubit technologies have pros and cons. As the race has just started, the superconducting front-runners may soon be surpassed. That means it is prudent to say there is no clear overall winner.

It is possible that in the future more than one qubit technology prevails, or that different qubits will be used for different purposes. For example, a superconducting computer could calculate while photonic qubits transfer information, in line with the arguments Olivier Ezratty raises in his book Understanding Quantum Technologies [16].

ColibrITD’s goal of bringing quantum computing to all in the short term is possible because we develop applications compatible with multiple hardware platforms. That is why…

ColibrITD has chosen a hardware-agnostic path, as the current landscape offers multiple quantum computers. Meanwhile, we expect the large strides of research to provide fault-tolerant computers in the near future. Until then, we will continue developing software solutions for whichever platform (or platforms) wins the quantum hardware race.

Comments and References

We have omitted topological qubits and quantum dots for brevity and because of the low availability of data on their experimental properties.

It is worth mentioning an original approach based on carbon nanotubes. This effort is led by the French company C12, which aims to build qubits that are resistant to decoherence due to the unique properties of carbon nanotubes.

For a more business-like recap, consult:





[5] Popkin, G. (2016). Quest for qubits. Science, 354(6316), 1090–1093. doi:10.1126/science.354.6316.1090











[16] find his book in his blog or buy a paperback copy at:

