What is quantum entanglement? A physicist explains the science of Einstein’s ‘spooky action at a distance’

Andreas Muller, Associate Professor of Physics, University of South Florida

The 2022 Nobel Prize in physics recognized three scientists who made groundbreaking contributions in understanding one of the most mysterious of all natural phenomena: quantum entanglement.

In the simplest terms, quantum entanglement means that aspects of one particle of an entangled pair depend on aspects of the other particle, no matter how far apart they are or what lies between them. These particles could be, for example, electrons or photons, and an aspect could be the state it is in, such as whether it is “spinning” in one direction or another.

The strange part of quantum entanglement is that when you measure something about one particle in an entangled pair, you immediately know something about the other particle, even if they are millions of light years apart. This odd connection between the two particles is instantaneous, seemingly breaking a fundamental law of the universe. Albert Einstein famously called the phenomenon “spooky action at a distance.”

Having spent the better part of two decades conducting experiments rooted in quantum mechanics, I have come to accept its strangeness. Thanks to ever more precise and reliable instruments and the work of this year’s Nobel winners, Alain Aspect, John Clauser and Anton Zeilinger, physicists now integrate quantum phenomena into their knowledge of the world with an exceptional degree of certainty.

However, as late as the 1970s, researchers were still divided over whether quantum entanglement was a real phenomenon. And for good reason – who would dare contradict the great Einstein, who himself doubted it? It took the development of new experimental technology and bold researchers to finally put this mystery to rest.


Existing in multiple states at once

To truly understand the spookiness of quantum entanglement, it is important to first understand quantum superposition. Quantum superposition is the idea that particles exist in multiple states at once. When a measurement is performed, it is as if the particle selects one of the states in the superposition.

For example, many particles have an attribute called spin that is measured either as “up” or “down” for a given orientation of the analyzer. But until you measure the spin of a particle, it simultaneously exists in a superposition of spin up and spin down.

There is a probability attached to each state, and it is possible to predict the average outcome from many measurements. The likelihood of a single measurement being up or down depends on these probabilities, but is itself unpredictable.
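As a concrete illustration (my own sketch in Python, using a made-up probability), the snippet below samples many simulated spin measurements: any single outcome is unpredictable, but the average over many runs converges to the value the probabilities predict.

```python
import random

# Hypothetical example: a spin state with a 70% chance of measuring "up"
# and a 30% chance of "down". These numbers are illustrative only.
p_up = 0.7

def measure_spin(p_up: float) -> int:
    """Simulate one measurement: +1 for spin up, -1 for spin down."""
    return 1 if random.random() < p_up else -1

# A single measurement is unpredictable...
print("one outcome:", measure_spin(p_up))

# ...but the average over many measurements is predictable:
n = 100_000
average = sum(measure_spin(p_up) for _ in range(n)) / n
print("average of many outcomes:", average)         # close to 2*p_up - 1 = 0.4
print("quantum expectation value:", 2 * p_up - 1)
```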

Though very weird, the mathematics and a vast number of experiments have shown that quantum mechanics correctly describes physical reality.


Two entangled particles

The spookiness of quantum entanglement emerges from the reality of quantum superposition, and was clear to the founding fathers of quantum mechanics who developed the theory in the 1920s and 1930s.

To create entangled particles you essentially break a system into two, where the sum of the parts is known. For example, you can split a particle with spin of zero into two particles that necessarily will have opposite spins so that their sum is zero.

In 1935, Albert Einstein, Boris Podolsky and Nathan Rosen published a paper describing a thought experiment designed to illustrate a seeming absurdity of quantum entanglement that challenged a foundational law of the universe.

A simplified version of this thought experiment, attributed to David Bohm, considers the decay of a particle called the pi meson. When this particle decays, it produces an electron and a positron that have opposite spin and are moving away from each other. Therefore, if the electron spin is measured to be up, then the measured spin of the positron could only be down, and vice versa. This is true even if the particles are billions of miles apart.


This would be fine if the measurement of the electron spin were always up and the measured spin of the positron were always down. But because of quantum mechanics, the spin of each particle is both part up and part down until it is measured. Only when the measurement occurs does the quantum state of the spin “collapse” into either up or down – instantaneously collapsing the other particle into the opposite spin. This seems to suggest that the particles communicate with each other through some means that moves faster than the speed of light. But according to the laws of physics, nothing can travel faster than the speed of light. Surely the measured state of one particle cannot instantaneously determine the state of another particle at the far end of the universe?
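For readers who like to see the bookkeeping, here is a minimal sketch (my own illustration, not from the article) that writes the two-particle singlet state as a vector of amplitudes and applies the Born rule: the only outcomes with nonzero probability are the ones where the two spins disagree.

```python
import numpy as np

# Basis order for two spins: |up,up>, |up,down>, |down,up>, |down,down>
# The spin-zero "singlet" state: (|up,down> - |down,up>) / sqrt(2)
singlet = np.array([0, 1, -1, 0]) / np.sqrt(2)

labels = ["up,up", "up,down", "down,up", "down,down"]
probabilities = np.abs(singlet) ** 2   # Born rule: probability = |amplitude|^2

for label, p in zip(labels, probabilities):
    print(f"P({label}) = {p:.2f}")

# Output: only "up,down" and "down,up" have nonzero probability (0.50 each),
# so whenever one particle is measured up, the other is found down.
```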

Physicists, including Einstein, proposed a number of alternative interpretations of quantum entanglement in the 1930s. They theorized there was some unknown property – dubbed hidden variables – that determined the state of a particle before measurement. But at the time, physicists had neither the technology nor a clearly defined measurement that could test whether quantum theory needed to be modified to include hidden variables.


Disproving a theory

It took until the 1960s before there were any clues to an answer. John Bell, a brilliant Irish physicist who did not live to receive the Nobel Prize, devised a scheme to test whether the notion of hidden variables made sense.

Bell derived an equation, now known as Bell’s inequality, that is always satisfied by local hidden-variable theories but can be violated by quantum mechanics. Thus, if Bell’s inequality was found to be violated in a real-world experiment, local hidden-variable theories could be ruled out as an explanation for quantum entanglement.
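To see what that means in numbers, here is a small Python sketch (my addition, not the author’s): it evaluates the CHSH combination of correlations for the singlet state, whose quantum mechanical correlation at analyzer angles a and b is −cos(a − b), and compares it with the bound of 2 that every local hidden-variable theory must obey.

```python
import numpy as np

def correlation(a: float, b: float) -> float:
    """Quantum prediction for the singlet state measured at analyzer angles a, b."""
    return -np.cos(a - b)

# A standard choice of angles that maximizes the quantum violation.
a, a_prime = 0.0, np.pi / 2
b, b_prime = np.pi / 4, 3 * np.pi / 4

S = (correlation(a, b) - correlation(a, b_prime)
     + correlation(a_prime, b) + correlation(a_prime, b_prime))

print(f"|S| predicted by quantum mechanics: {abs(S):.3f}")  # 2*sqrt(2) ~ 2.828
print("bound satisfied by every local hidden-variable theory: 2")
```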

The experiments of the 2022 Nobel laureates, particularly those of Alain Aspect, were the first tests of the Bell inequality. The experiments used entangled photons, rather than pairs of an electron and a positron, as in many thought experiments. The results conclusively ruled out the existence of local hidden variables, a mysterious attribute that would predetermine the states of entangled particles before measurement. Collectively, these and many follow-up experiments have vindicated quantum mechanics. Objects can be correlated over large distances in ways that physics before quantum mechanics cannot explain.

Importantly, there is also no conflict with special relativity, which forbids faster-than-light communication. The fact that measurements over vast distances are correlated does not imply that information is transmitted between the particles. Two parties far apart performing measurements on entangled particles cannot use the phenomenon to pass along information faster than the speed of light.

Today, physicists continue to research quantum entanglement and investigate potential practical applications. Although quantum mechanics can predict the probability of a measurement with incredible accuracy, many researchers remain skeptical that it provides a complete description of reality. One thing is certain, though. Much remains to be said about the mysterious world of quantum mechanics.

Andreas Muller, Associate Professor of Physics, University of South Florida

This article is republished from The Conversation under a Creative Commons license. Read the original article.

June 15, 2017

China Shatters “Spooky Action at a Distance” Record, Preps for Quantum Internet

Results from the Micius satellite test quantum entanglement, pointing the way toward hackproof global communications

By Lee Billings


In a landmark study, a team of Chinese scientists using an experimental satellite tested quantum entanglement over unprecedented distances, beaming entangled pairs of photons to three ground stations across China—each separated by more than 1,200 kilometers. The test verifies a mysterious and long-held tenet of quantum theory and firmly establishes China as the front-runner in a burgeoning “quantum space race” to create a secure, quantum-based global communications network—that is, a potentially unhackable “quantum Internet” that would be of immense geopolitical importance. The findings were published in 2017 in Science.

“China has taken the leadership in quantum communication,” says Nicolas Gisin, a physicist at the University of Geneva, who was not involved in the study. “This demonstrates that global quantum communication is possible and will be achieved in the near future.”

The concept of quantum communications is considered the gold standard for security, in part because any compromising surveillance leaves its imprint on the transmission. Conventional encrypted messages require secret keys to decrypt, but those keys are vulnerable to eavesdropping as they are sent out into the ether. In quantum communications, however, these keys can be encoded in various quantum states of entangled photons—such as their polarization—and these states will be unavoidably altered if a message is intercepted by eavesdroppers. Ground-based quantum communications typically send entangled photon pairs via fiber-optic cables or open air. But collisions with ordinary atoms along the way disrupt the photons’ delicate quantum states, limiting transmission distances to a few hundred kilometers. Sophisticated devices called quantum repeaters—equipped with “quantum memory” modules—could in principle be daisy-chained together to receive, store and retransmit the quantum keys across longer distances, but this task is so complex and difficult that such systems remain largely theoretical.
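To make the eavesdropping argument concrete, here is a toy simulation in Python (my own illustrative sketch, not the protocol used by Micius or any specific system): key bits are encoded in one of two polarization bases, and an interceptor who measures and resends each photon in a randomly chosen basis introduces roughly a 25 percent error rate in the bits the legitimate parties compare.

```python
import random

def send_key_bits(n: int, eavesdrop: bool) -> float:
    """Return the error rate Alice and Bob observe on bits where their bases match."""
    errors, compared = 0, 0
    for _ in range(n):
        alice_basis = random.choice(["+", "x"])   # rectilinear or diagonal polarization basis
        alice_bit = random.choice([0, 1])

        bit_in_flight, basis_in_flight = alice_bit, alice_basis
        if eavesdrop:
            eve_basis = random.choice(["+", "x"])
            # Measuring in the wrong basis randomizes the bit; Eve resends in her own basis.
            if eve_basis != alice_basis:
                bit_in_flight = random.choice([0, 1])
            basis_in_flight = eve_basis

        bob_basis = random.choice(["+", "x"])
        bob_bit = bit_in_flight if bob_basis == basis_in_flight else random.choice([0, 1])

        if bob_basis == alice_basis:              # only these bits become key material
            compared += 1
            errors += (bob_bit != alice_bit)
    return errors / compared

print("error rate without eavesdropper:", send_key_bits(20000, eavesdrop=False))  # ~0
print("error rate with eavesdropper:   ", send_key_bits(20000, eavesdrop=True))   # ~0.25
```

In a real link the parties sacrifice a random subset of their key to estimate this error rate; a rate well above the channel’s expected noise level signals that the key cannot be trusted.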

“A quantum repeater has to receive photons from two different places, then store them in quantum memory, then interfere them directly with each other” before sending further signals along a network, says Paul Kwiat, a physicist at the University of Illinois at Urbana-Champaign, who is unaffiliated with the Chinese team. “But in order to do all that, you have to know you’ve stored them without actually measuring them.” The situation, Kwiat says, is a bit like knowing what you have received in the mail without looking in your mailbox or opening the package inside. “You can shake the package—but that’s difficult to do if what you’re receiving is just photons. You want to make sure you’ve received them, but you don’t want to absorb them. In principle, it’s possible—no question—but it’s very hard to do.”

To form a globe-girdling secure quantum communications network, then, the only available solution is to beam quantum keys through the vacuum of space, then distribute them across tens to hundreds of kilometers using ground-based nodes. Launched into low Earth orbit in 2016 and named after an ancient Chinese philosopher, the 600-kilogram Micius satellite is China’s premier effort to do just that, as part of the nation’s $100-million Quantum Experiments at Space Scale (QUESS) program.

Micius carries in its heart an assemblage of crystals and lasers that generates entangled photon pairs, then splits and transmits them on separate beams to ground stations in its line of sight on Earth. For the latest test, the three receiving stations were located in the cities of Delingha and Ürümqi—both on the Tibetan Plateau—as well as in the city of Lijiang in China’s far southwest. At 1,203 kilometers, the geographical distance between Delingha and Lijiang was the record-setting stretch over which the entangled photon pairs were transmitted.

For now the system remains mostly a proof of concept because the current reported data-transmission rate between Micius and its receiving stations is too low to sustain practical quantum communications. Of the roughly six million entangled pairs that Micius’s crystalline core produced during each second of transmission, only about one pair per second reached the ground-based detectors after the beams weakened as they passed through Earth’s atmosphere and each receiving station’s light-gathering telescopes. Team leader Jian-Wei Pan—a physicist at the University of Science and Technology of China in Hefei who had pushed and planned for the experiment since 2003—compares the feat with detecting a single photon from a lone match struck by someone standing on the moon. Even so, he says, Micius’s transmission of entangled photon pairs is “a trillion times more efficient than using the best telecommunication fibers.... We have done something that was absolutely impossible without the satellite.” Soon, Pan says, QUESS will launch more practical quantum communications satellites.
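As a rough back-of-the-envelope check (using only the round numbers quoted above, so purely illustrative), the implied link efficiency is:

```python
import math

pairs_produced_per_second = 6e6   # entangled pairs generated aboard Micius
pairs_detected_per_second = 1     # coincidences seen at the ground stations

transmission = pairs_detected_per_second / pairs_produced_per_second
print(f"two-photon transmission: {transmission:.1e}")               # ~1.7e-07
print(f"equivalent loss: {-10 * math.log10(transmission):.0f} dB")  # ~68 dB
```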

Although Pan and his team later used Micius to distribute quantum keys between ground stations in China and Austria in 2017, enabling secure intercontinental communications, their initial demonstration instead aimed to achieve a simpler task: proving Albert Einstein wrong.

Einstein famously derided as “spooky action at a distance” one of the most bizarre elements of quantum theory—the way that measuring one member of an entangled pair of particles seems to instantaneously change the state of its counterpart, even if that counterpart particle is on the other side of the galaxy. This was abhorrent to Einstein because it suggests information might be transmitted between the particles faster than light, breaking the universal speed limit set by his theory of special relativity. Instead, he and others posited, perhaps the entangled particles somehow shared “hidden variables” that are inaccessible to experiment but would determine the particles’ subsequent behavior when measured. In 1964 physicist John Bell devised a way to test Einstein’s idea, calculating a limit that physicists could statistically measure for how much hidden variables could possibly correlate with the behavior of entangled particles. If experiments showed this limit to be exceeded, then Einstein’s idea of hidden variables would be incorrect.

Ever since the 1970s “Bell tests” by physicists across ever larger swaths of spacetime have shown that Einstein was indeed mistaken and that entangled particles do in fact surpass Bell’s strict limits. One definitive test occurred in the Netherlands in 2015, when a team at Delft University of Technology closed several potential “loopholes” that had plagued past experiments and offered slim but significant opportunities for the influence of hidden variables to slip through. That test, though, involved separating entangled particles by scarcely more than a kilometer. With Micius’s transmission of entangled photons between widely separated ground stations, Pan’s team performed a Bell test at distances 1,000 times greater. Just as before, their results confirm that Einstein was wrong. The quantum realm remains a spooky place—although no one yet understands why.

“Of course, no one who accepts quantum mechanics could possibly doubt that entanglement can be created over that distance—or over any distance—but it’s still nice to see it made concrete,” says Scott Aaronson, a physicist at the University of Texas at Austin. “Nothing we knew suggested this goal was unachievable. The significance of this news is not that it was unexpected or that it overturns anything previously believed but simply that it’s a satisfying culmination of years of hard work.”

That work largely began in the 1990s, when Pan, leader of the Chinese team, was a graduate student in the laboratory of physicist Anton Zeilinger, then at the University of Innsbruck in Austria. Zeilinger was Pan’s Ph.D. adviser, and they collaborated closely to test and further develop ideas for quantum communication. Pan returned to China to start his own lab in 2001, and Zeilinger started one as well at the Austrian Academy of Sciences in Vienna. For the next seven years they would compete fiercely to break records for transmitting entangled photon pairs across ever wider gaps, and in ever more extreme conditions, in ground-based experiments. All the while each man lobbied his respective nation’s space agency to green-light a satellite that could be used to test the technique from space. But Zeilinger’s proposals perished in a bureaucratic swamp at the European Space Agency, whereas Pan’s were quickly embraced by the China National Space Administration. Ultimately Zeilinger chose to collaborate again with his old pupil rather than compete against him; today the Austrian Academy of Sciences is a crucial partner in the QUESS program.

“I am happy that the Micius works so well,” Zeilinger says. “But one has to realize that it is a missed opportunity for Europe and others, too.”

For years now other researchers and institutions have been scrambling to catch up, pushing governments for more funding for further experiments on the ground and in space—and many of them see Micius’s success as the catalytic event they have been waiting for. “This is a major milestone because if we are ever to have a quantum Internet in the future, we will need to send entanglement over these sorts of long distances,” says Thomas Jennewein, a physicist at the University of Waterloo in Ontario, who was not involved with the study. “This research is groundbreaking for all of us in the community—everyone can point to it and say, ‘See, it does work!’”

Jennewein and his collaborators are pursuing a space-based approach from the ground up, partnering with the Canadian Space Agency to plan a smaller, simpler satellite that could eventually act as a “universal receiver” and redistribute entangled photons beamed up from ground stations. At the National University of Singapore, an international collaboration led by physicist Alexander Ling has already launched cheap shoebox-size CubeSats to create, study and perhaps even transmit photon pairs that are “correlated”—a situation just shy of full entanglement. And in the U.S., Kwiat is using NASA funding to develop a device that could someday test quantum communications using “hyperentanglement” (the simultaneous entanglement of photon pairs in multiple ways) onboard the International Space Station.

Perhaps most significantly, a team led by Gerd Leuchs and Christoph Marquardt of the Max Planck Institute for the Science of Light in Erlangen, Germany, is developing quantum communications protocols for commercially available laser systems already in space onboard the European Copernicus and SpaceDataHighway satellites. Using one of these systems, the team successfully encoded and sent simple quantum states to ground stations using photons beamed from a satellite in geostationary orbit, some 38,000 kilometers above Earth. This approach, Marquardt explains, does not rely on entanglement and is very different from that of QUESS—but it could, with minimal upgrades, nonetheless be used to distribute quantum keys for secure communications. Their results appeared in Optica.

“Our purpose is really to find a shortcut into making things like quantum-key distribution with satellites economically viable and employable, pretty fast and soon,” Marquardt says. “[Engineers] invested 20 years of hard work making these systems, so it’s easier to upgrade them than to design everything from scratch.... It is a very good advantage if you can rely on something that is already qualified in space because space qualification is very complicated. It usually takes five to 10 years just to develop that.”

Marquardt and others suspect, however, that this field could be much further advanced than has been publicly acknowledged, with developments possibly hidden behind veils of official secrecy in the U.S. and elsewhere. It may be that the era of quantum communication is already upon us. “Some colleague of mine made the joke that ‘the silence of the U.S. is very loud,’” Marquardt says. “They had some very good groups concerning free-space satellites and quantum-key distribution at Los Alamos [National Laboratory] and other places, and suddenly they stopped publishing. So we always say there are two reasons that they stopped publishing: either it didn’t work, or it worked really well!”


'Spooky action at a distance' can lead to a multiverse. Here's how.

A new reality might be produced by every possible quantum interaction.


Some interpretations of quantum mechanics propose that our entire universe is described by a single universal wave function that constantly splits and multiplies, producing a new reality for every possible quantum interaction. That's quite a bold statement. So how do we get there?

One of the earliest realizations in the history of quantum mechanics is that matter has a wave-like property. The first to propose this was French physicist Louis de Broglie, who argued that every subatomic particle has a wave associated with it, just as light can behave as both a particle and a wave.

Other physicists soon confirmed this radical idea, especially in experiments where electrons scattered off a thin foil before landing on a target. The way the electrons scattered was more characteristic of a wave than a particle. But then, a question came up: What, exactly, is a wave of matter? What does it look like?


Early quantum theorists such as Erwin Schrödinger believed that particles themselves were smeared out over space in the shape of a wave. He developed his famous equation to describe the behavior of those waves, which is still used today. But Schrödinger's idea ran up against further experimental tests. For example, even though an electron acted like a wave midflight, when it reached a target, it landed as a single, compact particle, so it couldn't be physically extended in space.

Instead, an alternative interpretation began to gain ground. Today, we call it the Copenhagen interpretation of quantum mechanics, and it is by far the most popular interpretation among physicists. In this model, the wave function — the name physicists give to the wave-like property of matter — doesn't really exist. Instead, it's a mathematical convenience that we use to describe a cloud of quantum mechanical probabilities for where we might find a subatomic particle the next time we go looking for it.

Chains of entanglement

The Copenhagen interpretation has several problems, however. As Schrödinger himself pointed out, it's unclear how the wave function goes from a cloud of probabilities before measurement to simply not existing the moment we make an observation. 

So perhaps there's something more meaningful to the wave function. Perhaps it's as real as all of the particles themselves. De Broglie was the first to propose this idea, but he eventually joined the Copenhagen camp. Later physicists, like Hugh Everett, looked at the problem again and came to the same conclusion.

Making the wave function a real thing solves the measurement problem of the Copenhagen interpretation, because it stops measurement from being a super-special process that destroys the wave function. Instead, what we call a measurement is really just a long series of quantum particles and wave functions interacting with other quantum particles and wave functions.

If you build a detector and shoot electrons at it, for example, at the subatomic level, the electron doesn't know it's being measured. It just hits the atoms on the screen, which sends an electrical signal (made of more electrons) down a wire, which interacts with a display, which emits photons that hit the molecules in your eyes, and so on.

In this picture, every single particle gets its own wave function, and that's it. All of the particles and all of the wave functions just interact as they normally do, and we can use the tools of quantum mechanics (like Schrödinger's equation) to make predictions for how they'll behave.

The universal wave function

But quantum particles have a really interesting property because of their wave function. When two particles interact, they don't just bump into each other; for a brief time, their wave functions overlap. When that happens, you can't have two separate wave functions anymore. Instead, you must have a single wave function that describes both particles simultaneously. 

When the particles go their separate ways, they still maintain this united wave function. Physicists call this process quantum entanglement — what Albert Einstein referred to as "spooky action at a distance."

When we retrace all the steps of a measurement, what comes out is a series of entanglements from overlapping wave functions. The electron entangles with the atoms in the screen, which entangle with the electrons in the wire, and so on. Even the particles in our brains entangle with Earth, with all the light coming and going from our planet, all the way up to every particle in the universe entangling with every other particle in the universe.

With every new entanglement, you have a single wave function that describes all of the combined particles. So the obvious conclusion from making the wave function real is that there is a single wave function that describes the entire universe.
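Here is a small illustration of that claim (my own sketch, not from the article): building a two-particle state with NumPy and checking, via the singular values of its coefficient matrix, whether it factors into two independent single-particle wave functions.

```python
import numpy as np

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

# A separable (non-entangled) state: each particle keeps its own wave function.
separable = np.kron(up, down)

# An entangled state: (|up,down> - |down,up>) / sqrt(2); only the pair has a wave function.
entangled = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

def schmidt_rank(state: np.ndarray) -> int:
    """Number of nonzero singular values of the 2x2 coefficient matrix.
    Rank 1 means the state factors into independent single-particle states."""
    singular_values = np.linalg.svd(state.reshape(2, 2), compute_uv=False)
    return int(np.sum(singular_values > 1e-12))

print("Schmidt rank of separable state:", schmidt_rank(separable))  # 1
print("Schmidt rank of entangled state:", schmidt_rank(entangled))  # 2 -> not factorable
```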


This is called the "many worlds" interpretation of quantum mechanics. It gets this name when we ask what happens during the process of observation. In quantum mechanics, we're never sure what a particle will do — sometimes it may go up, sometimes it may go down, and so on. In this interpretation, every time a quantum particle interacts with another quantum particle, the universal wave function splits into multiple sections, with different universes containing each of the different possible results.

And this is how you get a multiverse. Through the mere act of quantum particles entangling with each other, you get multiple copies of the universe created over and over again all the time. Each one is identical, save for the tiny difference in some random quantum process. That means there are multiple copies of you reading this article right now, all exactly the same except for some tiny quantum detail.

This interpretation has difficulties as well — for example, how does this splitting actually unfold? But it's a radical way to view the universe and a demonstration of just how powerful quantum mechanics is as a theory — what started as a way to understand the behavior of subatomic particles may govern the properties of the entire cosmos.


Paul Sutter

Paul M. Sutter is an astrophysicist at SUNY Stony Brook and the Flatiron Institute in New York City. Paul received his PhD in Physics from the University of Illinois at Urbana-Champaign in 2011, and spent three years at the Paris Institute of Astrophysics, followed by a research fellowship in Trieste, Italy. His research focuses on many diverse topics, from the emptiest regions of the universe to the earliest moments of the Big Bang to the hunt for the first stars. As an "Agent to the Stars," Paul has passionately engaged the public in science outreach for several years. He is the host of the popular "Ask a Spaceman!" podcast, author of "Your Place in the Universe" and "How to Die in Space," and he frequently appears on TV, including on The Weather Channel, for which he serves as Official Space Specialist.


Open access | Published: 21 May 2023

Testing the speed of “spooky action at a distance” in a tabletop experiment

  • Luigi Santamaria Amato   ORCID: orcid.org/0000-0003-0898-3569 1 ,
  • Deborah Katia Pallotti 1 ,
  • Mario Siciliani de Cumis   ORCID: orcid.org/0000-0003-2854-5881 1 ,
  • Daniele Dequal   ORCID: orcid.org/0000-0002-2206-5038 1 ,
  • Andrea Andrisani 1 &
  • Sergei Slussarenko 2  

Scientific Reports, volume 13, Article number: 8201 (2023)


Nonlocality, probably the principal source of friction between quantum physics and relativity, has disturbed physicists even more than the question of realism, since it appears to imply superluminal signalling, Einstein's "spooky action at a distance". Since 2000, several tests have been performed to set lower bounds on the velocity of this spooky action at a distance ( \(c \beta _{t,max}\) ). They are usually based on a Bell test performed in kilometer-long, carefully balanced experimental setups, with each experiment improving the bound under assumptions dictated by its experimental conditions. By exploiting advances in quantum technologies, we performed a Bell test lasting only a few minutes in a tabletop experiment and obtained an improved bound, while controlling parameters that are otherwise uncontrollable in an extended setup or in long-lasting experiments.

Introduction

In 1964, John Stewart Bell, through the formulation of the Bell inequality, devised an experimental method (Ref. 1) to show that quantum physics is incompatible with certain types of local hidden-variable theories, resolving the long-standing debate triggered by the Einstein–Podolsky–Rosen (EPR) paradox (Ref. 2), which had until then been considered a purely philosophical issue.

The inequality, later recast in a form adapted for experiments, the well-known CHSH form (Ref. 3), allowed the debate to be settled through the experimental measurement of a parameter, usually named S, related to correlations between measurement outcomes on entangled particles.

If all system properties are defined before measurement, even if some of them remain unknown owing to the incompleteness of the theory, then \(|S| \le 2\); moreover, no "Spooky Action at a Distance" (SAD) is present among the subsystems of an entangled state (locality condition). Quantum theory, on the other hand, predicts that \(|S|\) can reach \(2\sqrt{2}\). Bell's work shifted the debate from epistemology to experimental physics, and over time several experiments testing the Bell inequality \(|S| \le 2\) were performed, confirming violation of the inequality in favor of the predictions of quantum mechanics (QM).

More recently, Bell test demonstrations have aimed at reducing the number of assumptions made about the experimental setup by addressing possible loopholes; see Refs. 4 and 5 for comprehensive reviews. A completely loophole-free Bell test cannot exist because of the free-will loophole (events that look causally disconnected could be correlated through an event in their common past, ultimately the Big Bang). Although all the experiments confirmed QM, there is still room for a superluminal theory, in which a first event could physically influence a second one through spooky action at a distance (SAD), despite being space-like separated from it. In this case, if the usual Einstein clock synchronization is adopted, such an influence would need to be defined in some universal Privileged Frame (PF) in order to avoid causal paradoxes. In 1989, Eberhard proposed an experiment (Ref. 6) to set a lower bound on the SAD velocity ( \(c \beta _{t,max}\) ), assuming a given preferred frame. The experiment is based on the idea that, if the speed of SAD is finite and the detection events (A and B) are simultaneous in the privileged frame, the communication between the two events does not arrive in time and a Bell violation is not observed. Moreover, events A and B that are simultaneous in one frame are simultaneous in all frames moving in a direction perpendicular to the line joining A and B. Eberhard proposed optimizing the simultaneity of the detection events and performing a Bell test over a 12-hour period on a setup in which events A and B are oriented east-west, in order to scan all possible orientations of the candidate preferred frame.

Following Eberhard's idea, in 2000 a 10.6-km-long, nearly east-west oriented EPR experiment performed in Geneva was analyzed (Ref. 7). The results produced a value for \(c \beta _{t,max}\) of the order of \(10^4 c\). Subsequent works aimed at setting more stringent velocity bounds (Refs. 8, 9) and at closing the freedom-of-choice loophole (Ref. 10). Although experiments involving kilometers of photon propagation distance provided values for \(c\beta _{t,max}\) spanning from \(10^4 c\) to \(5 \times 10^6 c\), they were hampered by uncontrollable environmental conditions, imperfect east-west alignment and days-long acquisition times, complicating further advances and scalability.

Here we perform a test of the speed of "spooky action at a distance" using a simple tabletop Bell test in an east-west aligned setup. The small scale of our experiment allowed us to perform simultaneity tests under controlled environmental conditions, with precise characterization of the photon properties and a short acquisition time. We set a bound on the speed of 'spooky action at a distance' in the Cosmic Microwave Background (CMB) reference frame that improves on the 16-km-long test of Ref. 10 by more than a factor of two.

Moreover, the use of a high-performance tabletop setup allowed us to address several issues not tackled in Refs. 8 and 9: the control of environmental conditions (temperature, humidity, etc.); the avoidance of splitting the Bell test over several days, which would require coincidences acquired on different days to be combined in the calculation of a given S value; the measurement of the photon temporal shape, which produces uncertainty in the arrival time (see Numerical estimation of \(\beta _{t,max}\) ); and the use of polarization entanglement with measurement settings on each side, which is more suitable for a Bell test of local realism (Refs. 11, 12). In addition, even though the present experiment lasted 11 minutes, the sharp east-west orientation adopted for the baseline will allow a future 12-hour experiment to consider all possible frames as the candidate preferred one.

As pointed out by Eberhard (Ref. 6), the question of a finite value for the SAD velocity is not merely philosophical, since it leaves room for a violation of quantum mechanical predictions; more than this, it involves special relativity as well, as several models (Refs. 13, 14) foresee the possibility of superluminal communication if a SAD velocity satisfying c < SAD velocity < infinity were demonstrated. The entangled-photon source we developed and, more generally, the experimental setup with enhanced simultaneity accuracy (Ref. 15) are of great importance for technological applications with strict timing and synchronization requirements, including teleportation (Ref. 16), space missions for global quantum communication (Ref. 17), the quantum internet (Refs. 18, 19), clock synchronization (Ref. 20) and quantum sensing (Ref. 21), to name just a few.

Increasingly efficient systems for the generation, transmission and detection of entanglement will become ever more widespread as test beds to probe the tensions between quantum physics and relativity, as demonstrated by several experiments proposed and performed in recent years (Refs. 22, 23, 24).

We generate nearly degenerate polarization-entangled photon pairs at telecom wavelength, subsequently separated by a dichroic beam splitter. Through state-preparation optics (see Paragraph Experimental setup), we prepare the maximally entangled antisymmetric Bell state \(\left| \psi ^- \right\rangle = \frac{1}{\sqrt{2}} \left( \left| HV \right\rangle - \left| VH \right\rangle \right)\) , which is sent in opposite (east-west) directions onto absorbing polarizers and then onto single-photon detectors for the Bell test.

Figure 1. Polarization-correlation measurements. Coincidence counts for detectors A and B (5 s integration time, 120 ps coincidence window) as a function of the angle \(\xi _B\) (orientation of the polarizer before detector B), for the (H/V/D/A) bases corresponding respectively to angles \(\xi _A=(0^{\circ }/90^{\circ }/45^{\circ }/135^{\circ })\) of the polarizer before detector A.

If the optical paths travelled by the two photons are equal, a hypothetical, non-instantaneous "quantum information" generated by the first detection event A does not arrive in time at the second detection event B, provided those events are almost simultaneous. This condition should prevent the violation of the Bell inequality; moreover, if observed in the PF, it would set a lower bound on the propagation velocity of quantum information given by \(\beta ^{(PF)}_{t,max}=\frac {d_{AB}}{\Delta d}\) , where \(d_{AB}\) is the spatial distance between the A and B detection events, while \(\Delta d\) is the path uncertainty. If, on the other hand, the experiment is performed in a laboratory frame at rest with respect to the Earth, the absence of a Bell violation determines a lower bound \(\beta _{t,max}\) for the adimensional speed of spooky action, as shown in Refs. 7, 8, 9 and 10 using Lorentz transformations. Here, \(\rho = \frac {\Delta d}{d_{AB}}\) , \(\beta\) is the modulus of the relative velocity of the PF with respect to the laboratory frame, while \(\beta _{AB}\) is the projection of that velocity along the baseline A-B.
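As a purely illustrative numerical example (the values below are assumptions, not the parameters reported in this paper), the privileged-frame bound scales as the baseline length divided by the residual path imbalance:

```python
# Illustrative values only - not the actual experimental parameters of the paper.
d_AB = 4.0          # meters, distance between detection events A and B
delta_d = 2e-4      # meters, residual uncertainty in equalizing the optical paths

beta_PF_max = d_AB / delta_d   # lower bound on the SAD speed, in units of c, in the PF
print(f"beta_t,max (privileged frame) >= {beta_PF_max:.1e} c")   # 2.0e+04 c
```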

With \(\beta\) almost fixed in most cases, as we will see in Paragraph CMB frame velocity, one has to make the parameters \(\rho\) and \(\beta _{AB}\) as small as possible in order to obtain a high value for \(\beta _{t,max}\). \(\rho\) must be reduced because the better the paths are equalized, the faster quantum information would have to propagate from the first detection event to the second. Concerning \(\beta _{AB}\) , if one knows with extreme accuracy the time \(t_{AB}\) at which the baseline and the PF velocity are orthogonal (and this occurs at least twice per day for every possible PF in the case of a perfect east-west orientation of the baseline), then in principle one can set \(\beta _{AB}=0\) by performing a Bell test at that precise time. In practice, the time interval \(\delta _S t\) required to acquire a value of S is always finite, so if the S measurement is performed in the interval \([t_{AB}-\delta _St/2,t_{AB}+\delta _S t/2]\) , we can prove (Ref. 8) that, for a baseline A-B perfectly aligned along the east-west direction, an upper bound for \(\beta _{AB}\) (Eq. (2)) holds in that interval of time, in which \(\omega =7.29\times 10^{-5}\) rad/s is the Earth's rotational angular velocity and the polar angle \(\chi\) is the angle between the relative velocity vector of the PF in the laboratory frame and the Earth's rotation axis. The last equality in (2) follows from \(\omega \delta _S t/2\ll 1\) , since \(\delta _S t\) is usually of the order of a few seconds.

Formula (2) needs further corrections in order to be considered realistic. Indeed, we have to take into account the finite uncertainty \(\sigma _{AB}\) in determining \(t_{AB}\) , and the fact that, even if one performs Bell tests back to back without interruption, there will always be a finite time step \(\delta _{step} t\) between the acquisition of one value of S and the next. In this case, as shown in \(t_{AB}\) rough determination, (2) has to be replaced by a corrected bound (Eq. (3)). In the following, we will try to make \(\delta _S t\) , \(\sigma _{AB}\) and \(\delta _{step} t\) as small as possible.

In order to perform a series of Bell tests, we generate the antisymmetric Bell state \(\left| \psi ^- \right\rangle = \frac{1}{\sqrt{2}} \left( \left| HV \right\rangle - \left| VH \right\rangle \right)\) with good but not exceptional visibility (about 80 per cent) and a signal-to-noise ratio of about 22 in the Horizontal/Vertical (H/V) and Diagonal/Antidiagonal (D/A) bases (Fig. 1). Because of the requirements of the experiment, we prioritized generation rate over two-photon interference visibility in order to reduce the acquisition time \(\delta _S t\).

Figure 1 shows the two-photon interference fringes in the H/V and D/A bases, measured with a contrast and a signal-to-noise ratio sufficient to violate the Bell inequality by more than 3 standard deviations. The polarization-axis angles are indicated by \(\xi _A\) and \(\xi _B\).

In this paper, we assume the CMB frame (the frame in which the cosmic microwave background radiation is isotropic) to be the preferred one. As assumed in several works (Refs. 25, 26, 27), the CMB frame is the natural candidate preferred frame. As detailed in paragraph CMB frame velocity, we calculated the projection along the A-B direction of the relative velocity vector \(\vec {\beta }(t)\) of the CMB frame with respect to the laboratory frame; for this task, the Earth's rotation and revolution, together with the latitude and longitude of the detectors, were considered. We performed the experiment when the baseline A-B was nearly orthogonal to \(\vec {\beta }(t)\) , so that events simultaneous along the baseline in the laboratory frame were also simultaneous in the CMB frame.

On 20 December 2021 the experiment started at 16:06:07 UTC, and we recorded the two-detector coincidences \(C\left( \xi _A,\xi _B\right)\) for 16 combinations of the polarizer orientations ( \(\xi _A\) , \(\xi _B\) ) (shown in Table 1) in front of detectors (A, B), in a time window of 120 ps with \(\delta _c t=5\) s integration time. From these counts we calculated the correlations \(E\left( \xi _A,\xi _B\right)\) , where \(\xi ^{90^{\circ }}=\xi +90^{\circ }\) , and then, in about 90 seconds, the S value for the Bell test.
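The sketch below shows the standard CHSH estimator applied to coincidence counts of this form; the angle set and counts are invented for illustration and are not the settings of Table 1 or the measured data.

```python
import math

# Correlation E(xi_A, xi_B) estimated from coincidence counts C at the four
# settings (xi, xi + 90 deg) on each side, as in a standard CHSH analysis.
def E(C, a, b):
    num = C[(a, b)] + C[(a + 90, b + 90)] - C[(a, b + 90)] - C[(a + 90, b)]
    den = C[(a, b)] + C[(a + 90, b + 90)] + C[(a, b + 90)] + C[(a + 90, b)]
    return num / den

def S(C, a1, a2, b1, b2):
    return E(C, a1, b1) - E(C, a1, b2) + E(C, a2, b1) + E(C, a2, b2)

# Invented 5-second coincidence counts for 16 settings (angles in degrees),
# modeling a psi-minus state with roughly 80% visibility.
counts = {}
for a in (0, 45, 90, 135):
    for b in (22.5, 67.5, 112.5, 157.5):
        counts[(a, b)] = round(500 * (1 - 0.8 * math.cos(math.radians(2 * (a - b)))))

print(f"S = {S(counts, 0, 45, 22.5, 67.5):.3f}")   # below -2 signals a Bell violation
```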

Figure 2. Experimental data for the CHSH inequality violation. Each blue data point represents a value of S centered in its acquisition time interval \(\delta _S t\) , equal to 90 s in our experiment. The orange line represents the mean value of S, while the violet area denotes the zone where no SAD takes place. Almost all of the 97 data points violate the Bell-CHSH inequality, lying at least 3 standard deviations \(\sigma\) (measured as the root mean square over repeated measurements of S) from the local realistic bound of \(-2\). The cyan point denotes the S acquisition corresponding to an experimental setup almost orthogonal to the CMB-Earth relative velocity. During its acquisition time, represented in the figure by the zone with diagonal grey stripes, detection events A and B are nearly simultaneous, both in the laboratory and in the CMB reference frame. The cyan point, too, lies more than 3 standard deviations from the local realistic bound \(S=-2\). Finally, the short vertical grey strip passing through the cyan point indicates the exact time, up to an uncertainty \(\pm \sigma _{AB}\) , at which \(\beta _{AB}=0\) (see Paragraph CMB frame velocity).

The experiment lasted about eleven minutes, during which we performed seven cycles of the 16 counting measurements (one for each of the polarizer orientations reported in Table 1) required to determine S according to ( 5 ) and ( 6 ). In order to increase the number of S measurements in the same span of time, we applied the following methodology: the first value of S was obtained from counting measurements 1 through 16, the second value of S from measurements 2 through 17 (that is, ending with the first measurement of the second cycle), and so on. With this series of overlapping measurements of S, we collected \(16\times 6+1=97\) acquisitions of this quantity, instead of the seven that separate, distinct estimations of S would have given, so achieving the goal of reducing \(\delta _{step} t\). Indeed, once an S value was acquired, the next value of S required only the mean time \(\delta _r t\) needed to rotate the polarizers plus the time \(\delta _c t\) needed to perform one new counting measurement; in synthesis, \(\delta _{step} t\) reduces to \(\delta _r t + \delta _c t\).
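A few lines make the sliding-window bookkeeping explicit (illustrative only, using the cycle counts quoted above):

```python
measurements_per_S = 16          # one counting measurement per polarizer setting
cycles = 7                       # complete cycles acquired during the run
total_measurements = cycles * measurements_per_S          # 112

# Overlapping (sliding-window) estimates of S versus separate, disjoint ones:
overlapping_S_values = total_measurements - measurements_per_S + 1
disjoint_S_values = cycles

print(overlapping_S_values)  # 97
print(disjoint_S_values)     # 7
```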

All 97 measurements of S are reported in Fig. 2 with their time distribution; the cyan point denotes the S measurement acquired when \(\beta _{AB}=0\) , that is, when simultaneity of the detection events in the laboratory corresponds to simultaneity in the CMB frame. If the speed of spooky action were not sufficient to link the detection events, S should assume values greater than \(-2\) when the projection of \(\vec {\beta }(t)\) along the two-detector baseline is zero ( \(\beta _{AB}=0\) ), and values lower than \(-2\) in the other time intervals, where such projection is nonzero. It is clear from the figure that all values of S are well below the local realistic bound. In particular, the S value shown in cyan in Fig. 2 is more than 4 standard deviations from the local realistic bound. Consequently, by inserting the numerical values of the experiment as calculated in paragraphs CMB frame velocity and Numerical estimation of \(\beta _{t,max}\) , an improved bound on the "speed of spooky action" of about \(3.3\times 10^4 c\) is obtained (tested only in the CMB frame).

In conclusion, thanks to a smart arrangement and to the measurement of the entangled-photon coherence time, we have obtained better results than experiments with baselines several kilometers long, despite adopting a baseline of only a few meters. The use of an ordinary laboratory setup with an accurate east-west orientation will allow, in the near future, a 12-hour experiment to test all possible reference frames, something otherwise impossible given the typical infrastructural constraints of extended experiments. Moreover, the small scale and the fast acquisition time make it possible to control environmental degrees of freedom that are otherwise uncontrollable. Here we do not address the loophole problem; we only perform a feasibility test for measuring a lower bound on the speed of SAD in a tabletop experiment, to unlock a new family of experiments that will evolve both toward closing the loopholes and toward extending the obtained bound. This bound shows considerable room for improvement, since the developed setup, thanks to its reduced dimensions, can easily be extended.

Experimental Setup

A 5 mW CW laser source at 775 nm pumps a heralded photon source (HPS in Fig.  3 ) consisting of a temperature-controlled type-II waveguide in a periodically poled lithium niobate (PPLN) crystal, which generates more than one million photon pairs per second. The crystal temperature is maintained at its optimal value of \(33.90^{\circ }\) C to within \(0.01^{\circ }\) C by a PID controller. The central wavelength of the down-converted photons is about 1550 nm.

The pairs are coupled into a compensating polarization-maintaining optical fiber of suitable length to counteract the delay between horizontal and vertical photons and to recover temporal indistinguishability. The exact length of the compensating fiber is 98 cm; it was selected experimentally, after several attempts, by maximizing the mean coincidence visibility in the (H/V/D/A) bases. The output of the compensating fiber is coupled to a dichroic beam splitter centered at 1550 nm that separates the entangled photons, which, using two collimators, are sent separately to two quarter waveplates and two half waveplates for state preparation. All the optics used (collimators, quarter waveplates and half waveplates) have anti-reflection coatings centered at 1550 nm. The state-preparation waveplates are rotated to minimize the coincidence counts for orthogonal polarizations (set through the polarizers P in front of the two detectors) in both the horizontal/vertical and the diagonal/antidiagonal bases. All experimental conditions (crystal temperature, laser power, optical layout, etc.) are optimized to obtain a large generation rate (and therefore a fast acquisition), with the only constraint that the two-photon interference visibility and the S/N ratio remain just sufficient to violate the Bell inequality by 3 standard deviations. The half and quarter waveplates are mounted on motorized rotation stages, and the acquisition is managed by home-made software developed in LabVIEW that controls the time tagger and the rotation stages. The prepared photons, described by the Bell state \(\left| \psi ^- \right\rangle = \frac{1}{\sqrt{2}} \left( \left| HV \right\rangle - \left| VH \right\rangle \right)\) , impinge on two thin-film polarizers and are detected by two InGaAs/InP single-photon avalanche diodes cooled to \(-90\,^\circ\) C. The quantum efficiencies of the detectors are about 20%, the dead times are set to 2 and 8 \(\mu s\) respectively, and the dark count rates are several kHz. The detectors generate TTL pulses recorded by a time-to-digital converter (time tagger, Qutools) with a resolution of 10 ps. In this experiment the coincidence integration time is set to 5 s and the coincidence window to 120 ps. Finally, one polarizer is mounted on a micrometer translation stage to equalize the paths of the two entangled photons. The laser source, the heralded photon source, the spectral filter and the state-preparation optics are very light (about 5 kg) and fit in a 25 cm cube box to facilitate portability in future experiments or space missions.

Figure 3: Experimental setup. A laser pumps the heralded photon source (HPS) to generate photon pairs. The pairs are coupled into a compensating polarization-maintaining optical fiber to recover temporal indistinguishability. The output of the compensating fiber is coupled to a dichroic beam splitter (SF) that separates the entangled photons, which, using two collimators (C), are sent separately to two quarter waveplates ( \(\lambda /4\) ) and two half waveplates ( \(\lambda /2\) ) for state preparation. The photons are sent in opposite directions onto two polarizers (P) and detectors (SPD). The detectors generate TTL pulses recorded by a time-to-digital converter (TT) and processed by a personal computer (PC). Finally, one polarizer is mounted on a micrometer translation stage (TR) to equalize the paths of the two entangled photons.

CMB frame velocity

In this section we determine the projection of the CMB frame velocity with respect to the laboratory reference frame, \(\vec {v}_{CMB,L}\) , along the baseline A-B of our experiment. In particular, we focus on determining the precise instant of time \(t_{AB}\) at which this projection vanishes, meaning that the baseline direction is momentarily at rest with respect to the CMB reference frame.

A CMB frame is a reference frame in which the CMB has an isotropic spatial distribution. Anisotropies in the CMB were detected soon after its discovery in 1965 by Penzias and Wilson 28 and were quickly attributed to the Earth's relative motion 29 . Subsequent experiments 30 , 31 , 32 clarified that these spatial anisotropies mainly have a dipole structure (higher multipole contributions to the total CMB are of order \(10^{-3}\) with respect to the dipole term 33 ), which more recent observations 34 , 35 have measured as a temperature of \(3362.08\pm 0.99\) \(\mu\) K along the direction \(l=264.021^{\circ }\pm 0.011^{\circ }\) , \(b=48.253^{\circ }\pm 0.005^{\circ }\) in Galactic coordinates. Assuming no intrinsic CMB anisotropy at this order of magnitude, this deviation is compatible with a Doppler shift due to the relative motion of the Solar System with respect to a CMB reference frame \(S_{0}\) , with speed \(v_{S,CMB}=(369.82\pm 0.11)\) km/s and, with the \(S_{0}\) axes set parallel to those of the ICRF/J2000 equatorial system, with Right Ascension (RA) \(\alpha =167.942^{\circ }\pm 0.007^{\circ }\) and declination \(\theta =-6.944^{\circ }\pm 0.007^{\circ }\) 35 .

The estimation of the velocity \(\vec v_{CMB,L}\) of \(S_{0}\) , the natural candidate for the PF, with respect to the laboratory is performed in two steps. First, following an approach similar to 7 , we applied some simplifications, concerning in particular the Earth's orbital motion, in order to easily obtain a rough estimate of \(t_{AB}\) , denoted by \(t^{(0)}_{AB}\) , with uncertainty \(\sigma _{AB}^{(0)}\) . Having thus restricted the temporal window for this event, we then used precise Earth ephemeris tables and Earth rotation angles to achieve a better estimate of \(\vec v_{CMB,L}\) , and consequently of \(t_{AB}\) , with an uncertainty \(\sigma _{AB}\) that, as we will see, is an order of magnitude smaller than \(\sigma _{AB}^{(0)}\) .

Rough determination of \(t_{AB}\)

Together with \(S_0\) , we consider two additional reference frames: the heliocentric reference frame \(S_{1}\) , centered at the Sun with axes parallel to J2000, and the geocentric equatorial frame J2000, denoted by \(S_2\) , that is, the frame centered at the Earth with the x -axis directed toward the vernal point at the date 1st of January 2000, 12:00 UTC, the z -axis along the Earth's rotation axis toward the north, and the y -axis chosen to complete a right-handed (counter-clockwise) orthogonal frame (see Fig. 4 ). In the following we will initially assume a perfectly circular orbit for the Earth, we will assume that clocks run at the same rate in \(S_0\) , \(S_1\) and \(S_2\) (the absolute-time approximation), and we will also neglect relativistic corrections to the velocity composition rule. By neglecting these and other correction terms, our results will be subject to some errors, which we take into account below.

Figure 4: ( a ) Earth orbital position at time \(t=t_0=\) 23rd September 2021, 02:40:12 UTC. The Earth occupies the vernal point \(\gamma\) with respect to the Sun, and the Greenwich meridian is rotated by an angle \(\varphi _0\simeq 41.96^{\circ }\) with respect to the \(x-z\) plane. ( b ) Earth orbital position at time \(t=t_{AB}=\) 20th December 2021, 16:11:22 UTC, from a different perspective. The Earth is very near the southern solstice. Blue velocity vectors refer to the \(S_0\) frame, with the dashed ones indicating the projection of \(\vec v_{S,CMB}\) on the celestial equatorial plane, while the black ones refer to the \(S_1\) frame.

In rectangular coordinates, momentarily neglecting the Sun's motion with respect to the Solar System barycenter, the Sun's velocity in the \(S_0\) reference frame reads
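Equation ( 7 ) is not reproduced in this copy. With the dipole parameters quoted above, the usual spherical-to-Cartesian conversion in the equatorial frame gives, as a hedged reconstruction,

\[
\vec v_{S,CMB}=v_{S,CMB}\left( \cos \theta \cos \alpha ,\ \cos \theta \sin \alpha ,\ \sin \theta \right) \simeq \left( -359.0,\ 76.7,\ -44.7\right) \ \text{km/s}.
\]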

With respect to the Sun, the Earth's center revolves with mean angular velocity \(\Omega =1.991\times 10^{-7}\) rad/s. With \(R=1.4960\times 10^8\) km the mean Earth-Sun distance, and neglecting the eccentricity of the Earth's orbit, the Earth's velocity \(\vec {v}_{E,S}\left( t\right)\) at time t in \(S_1\) rectangular coordinates reads:

where \(\delta =23.44^{\circ }\) is the obliquity of the Earth's orbit with respect to the celestial equator, while \(t_0\) is the last time at which the Earth occupied the vernal equinox point (or, equivalently, at which the Sun occupied the autumnal equinox point with respect to the Earth). According to the Horizons Web Application 36 , this occurred at 2:40:12 UTC on the 23rd of September 2021, or equivalently at Julian Date \(\text{ JD }=2459480.61125\) . The difference \(t-t_0\) is measured in seconds.

Finally, the laboratory velocity at time t with respect to the Earth-centered J2000 frame is given by

where \(\omega =7.29\times 10^{-5}\) rad/s is the angular velocity of the Earth's rotation about its axis, r is the mean Earth radius (or the Earth radius at the laboratory latitude if one takes the Earth's oblateness into account), \(\theta _{lab}\) and \(\phi _{lab}\) are the laboratory latitude and longitude respectively, and \(\varphi _0\) is the Earth Rotation Angle (ERA), that is, the angle between the Greenwich meridian and the J2000 x-z plane, at time \(t_0\) . In our case \(r=6369.6\) km, \(\theta _{lab}=40.65^{\circ }\) and \(\phi _{lab}=16.7^{\circ }\) , while, denoting by \(d=\text{ JD }-2451545\) the fractional number of days elapsed at \(t_0\) since the 1st of January 2000, 12:00 UTC, according to 37

or equivalently \(\varphi _0=41.96^{\circ }\) . Observe that in formula ( 10 ) d actually denotes the Julian date with respect to UT1 time; however, at this stage we can safely neglect the UT1-UTC time difference, amounting to \(\sim 0.1\) s for this date as reported in IERS Bulletin A 38 . Inserting \(\varphi _0\) into ( 9 ) and applying the Galilean velocity composition rule, we obtain the laboratory velocity \(\vec v_{L,CMB}\) with respect to the CMB reference frame at time t :

with \(\vec v_{S,CMB}\) and \(\vec v_{E,S}\) given by ( 7 ), ( 8 ) respectively. Then \(\vec {v}_{CMB,L}=-\vec {v}_{L,CMB}\) .
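Equation ( 10 ) also did not survive this copy. The standard Earth Rotation Angle expression from the cited almanac, \(ERA=2\pi \left( 0.7790572732640+1.00273781191135448\, d\right)\) , reproduces the quoted \(\varphi _0\) to within about \(0.01^{\circ }\) ; the short check below is a minimal sketch of that verification, assuming this is indeed the formula behind Eq. ( 10 ).

```python
import math

# Check that the standard Earth Rotation Angle expression reproduces phi_0 ~ 41.96 deg
# at t_0 = 2021-09-23 02:40:12 UTC (JD 2459480.61125), neglecting UT1-UTC as in the text.
def era_deg(jd_ut1):
    """Earth Rotation Angle in degrees (standard IAU 2000 expression)."""
    d = jd_ut1 - 2451545.0                      # days since J2000.0
    turns = 0.7790572732640 + 1.00273781191135448 * d
    return 360.0 * (turns % 1.0)

print(round(era_deg(2459480.61125), 2))         # ~41.95, vs. the quoted 41.96 deg
```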

As for the baseline A-B of our experiment, defined by the two detector positions \(\vec x_A\) and \(\vec x_B\) , its orientation \(\vec e_{AB}=\frac {\left( \vec x_B-\vec x_A\right) }{\left| \vec x_B-\vec x_A\right| }\) traces an easily predictable pattern with respect to the CMB frame. We calculate it with respect to the J2000 Earth frame \(S_2\) , which shares the same axis directions with \(S_0\) .

Denoting by \(\theta _A,\ \phi _A\) and \(\theta _B,\ \phi _B\) the latitude and longitude of the first and second detector, respectively, after a few calculations we get

In our case the baseline is oriented along the east-west direction (the two endpoints share the same latitude) and is \(\sim 7\) m long, corresponding to a longitude displacement of 0.295 arcseconds.

Finally, the projection of the CMB frame's relative velocity along the baseline at time t is

We are now ready to determine the times at which \(\beta _{AB}=0\) . For the day of the experiment, the 20th of December 2021, we numerically found \(t^{(0)}_{AB}=\) 16:10:52 UTC as the root of Eq. ( 13 ). This value is affected by errors due to uncertainties concerning (1) the actual relative velocities of the frames, and (2) time/phase shifts due to incorrect settings of the initial positions or to neglected differences in clock rates among the various frames.
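The root finding itself is elementary: Eq. ( 13 ) is the projection of \(\vec v_{CMB,L}\) onto the (rotating) baseline direction, divided by c, and \(t_{AB}\) is where it crosses zero. The sketch below only illustrates that step. The velocity vector is held fixed over the interval of interest and both its components and the starting baseline longitude are placeholders, not the paper's inputs, so the printed crossing times are illustrative rather than a reproduction of 16:10:52 UTC.

```python
import math

C_KM_S = 299_792.458
OMEGA = 7.29e-5                      # Earth rotation rate, rad/s (as in the text)

# Placeholder CMB-frame velocity of the laboratory in equatorial coordinates (km/s).
# In the paper this is built from Eqs. (7)-(11); here it is an assumed constant vector.
V_CMB_L = (-330.0, 60.0, -40.0)

def east_unit_vector(t, lon0_rad=0.5):
    """Unit vector of an east-west baseline at time t (s); lon0_rad is a placeholder
    for the baseline longitude (plus ERA) at t = 0."""
    lam = lon0_rad + OMEGA * t
    return (-math.sin(lam), math.cos(lam), 0.0)

def beta_ab(t):
    """Projection of the relative velocity on the baseline, in units of c (Eq. (13))."""
    e = east_unit_vector(t)
    return sum(v * u for v, u in zip(V_CMB_L, e)) / C_KM_S

def bisect_zero(f, lo, hi, tol=1e-3):
    """Simple bisection; assumes f(lo) and f(hi) have opposite signs."""
    flo = f(lo)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if flo * f(mid) <= 0.0:
            hi = mid
        else:
            lo, flo = mid, f(mid)
    return 0.5 * (lo + hi)

# Scan one day in coarse steps and refine each sign change by bisection.
ts = [600.0 * k for k in range(145)]            # 0 .. 24 h in 10-min steps
roots = [bisect_zero(beta_ab, a, b)
         for a, b in zip(ts, ts[1:])
         if beta_ab(a) * beta_ab(b) < 0.0]
print([round(r / 3600.0, 3) for r in roots])    # zero crossings, in hours
```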

Concerning point (1), denoting by \(\Delta \vec v\) the deviation of the velocity of the CMB frame with respect to the laboratory from \(\vec v_{CMB,L}\) , only its component along the baseline direction produces a displacement \(\Delta t\) of the root \(t^{(0)}_{AB}\) of \(\beta _{AB}\left( t\right)\) . If the baseline lies in a plane parallel to the plane \(z=0\) of \(S_0\) , that is, in a plane perpendicular to the Earth's rotation axis, as in our case, then from Fig. 5 we deduce that

where \(\vec v_{CMB,L}^{P}\) is the projection of \(\vec v_{CMB,L}\) onto the plane \(z=0\) . For \(t=t^{(0)}_{AB}\) we have \(\left| \vec v_{CMB,L}^{P}\right| =398.74\) km/s. As for the Solar System's velocity with respect to the CMB, as stated at the beginning of the section, the uncertainty is 0.11 km/s in magnitude and \(0.007^{\circ }\) in both RA and declination. Applying the first equation of ( 14 ), this gives an uncertainty of \(\left| \Delta t_{CMB}\right| \simeq 1.56\) s for the event. As for the variation of the Earth's velocity along its orbit, mainly due to the elliptical trajectory and Kepler's second law, a short calculation shows that the Earth's speed deviates from its mean value by at most

at aphelion and perihelion, where \(e\simeq 0.0167\) is the Earth's eccentricity 39 . As for the angular deviation of the Earth's actual velocity vector from that given in Eq. ( 8 ) under the circular-orbit hypothesis, two factors must be considered: the angular difference between the tangent to a circle and to an ellipse at the same anomaly, and the difference between the true and the mean anomaly, measured from the vernal point (the anomaly is the angular position of a celestial body along its orbit). The latter arises because the angular velocity is not constant on an elliptical orbit. The first deviation is limited to \(\sim e^2/2\simeq 3\times 10^{-4}\) rad and is negligible with respect to the second, which has a maximal value of \(\sim \pi e/2\simeq 0.026\) rad. Taking these figures together with ( 15 ), we find \(\left| \Delta \vec v \right| \simeq 0.93\) km/s, so that from the second equation in Eq. ( 14 ) we get \(\left| \Delta t\right| \lesssim 32.3\) s. The contributions from point (2) can be neglected: for example, if we synchronize the \(S_1\) and \(S_2\) clocks when the Earth occupies the vernal point, at time \(t^{(0)}_{AB}\) the \(S_2\) frame registers a time difference of \(\sim 0.08\) s with respect to \(S_1\) . In conclusion, taking \(\Delta t_{CMB}\) into account as well, the uncertainty in determining the null velocity projection is bounded by \(\pm \sigma _{AB}^{(0)}\) , with \(\sigma _{AB}^{(0)}\sim 34\) s.
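The error budget above can be cross-checked with the quoted figures alone. In the sketch below the two orbital contributions are combined in quadrature, an assumption that reproduces the quoted 0.93 km/s, and the resulting timing shift is estimated as \(\Delta t\simeq \left| \Delta \vec v\right| /\left( \omega \left| \vec v_{CMB,L}^{P}\right| \right)\) , as sketched in Fig. 5.

```python
import math

OMEGA = 7.29e-5                        # rad/s, Earth rotation rate
V_P = 398.74                           # km/s, |v_CMB,L| projected onto the z = 0 plane
V_E_MEAN = 1.991e-7 * 1.4960e8         # km/s, mean orbital speed Omega * R (~29.8)
ECC = 0.0167                           # Earth orbital eccentricity

# Speed modulation over the orbit (~ e * Omega * R) and angular deviation of the
# velocity direction (dominated by the true-minus-mean anomaly, ~ pi * e / 2).
dv_speed = ECC * V_E_MEAN                       # ~0.50 km/s
dv_angle = (math.pi * ECC / 2.0) * V_E_MEAN     # ~0.78 km/s
dv_total = math.hypot(dv_speed, dv_angle)       # ~0.93 km/s (quadrature is an assumption)

dt_orbit = dv_total / (OMEGA * V_P)             # ~32 s, cf. the quoted 32.3 s
print(round(dv_total, 2), round(dt_orbit, 1))   # with the 1.56 s dipole term, ~34 s total
```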

Accurate determination of \(t_{AB}\)

We now improve the accuracy of \(t_{AB}\) by replacing the velocities \(\vec v_{E,S}\) of ( 8 ), describing the Earth's orbital motion around the Sun, with the analogous velocities from Earth ephemeris tables. Such velocities are nowadays referred to the International Celestial Reference Frame (ICRF) 40 , 41 , centered at the Solar System barycenter; in this way, together with the actual elliptical orbit of the Earth, several other contributions previously neglected, such as the Earth-Moon motion around their common barycenter, the Sun's motion around the Solar System barycenter and relativistic differences in clock rates, are taken into account. For this task we again chose the Earth ephemeris tables from the Horizons Web Application, for a time window centered at 16:10:52 UTC with an amplitude of \(\pm 3\cdot \sigma _{AB}^{(0)}\simeq \pm 100\) s 42 . In addition, for the same time window we replace the term \(\omega \left( t-t_0\right) +\varphi _0\) , where t denotes UTC time, with the more precise ERA given by ( 10 ) in the expressions ( 9 ) and ( 12 ) defining \(\vec v_{L,E}\) and \(\vec e_{AB}\) respectively. Some care must be taken with the various time coordinates involved: the velocity ephemeris tables from Horizons are reported for given values of Barycentric Dynamical Time (TDB), the ERA formula ( 10 ) requires UT1 temporal coordinates, while all activities in the laboratory were synchronized with UTC. To link these temporal coordinates, we recovered the time differences TDB-UTC from the Horizons Web Application and the time differences UT1-UTC from the IERS Bulletin A. In the time window considered, we have a constant value TDB-UTC = 69.1836 s 42 , while for the 20th of December UT1-UTC = \(-0.1084\) s 38 . Finally, we estimate \(\vec v_{CMB,L}\) by again applying the Galilean composition law for velocities. Indeed, one can show that the relativistic corrections to the classical linear sum of velocities amount to \(\sim 6\cdot 10^{-4}\) km/s in this case, well below the 0.11 km/s uncertainty on the CMB frame velocity, and for this reason they can be neglected.

Figure 5: Velocity error propagation. An uncertainty \(\Delta \vec {v}\) in \(\vec {v}_{CMB,L}\) produces an uncertainty of an angle \(\zeta\) in the direction of \(\vec {v}_{CMB,L}\) with respect to the baseline A-B. The Earth's rotation then turns this into an uncertainty \(\Delta t=\frac { \zeta }{\omega }\) on the time t of the null projection of \(\vec {v}_{CMB,L}\) along the baseline.

In Table 2 we report the quantities \(\vec v_{CMB,L}\) and \(\vec e_{AB}\) so obtained, together with the corresponding values of \(\beta _{AB}\) , for several UTC times. As can be seen, \(\beta _{AB}\) crosses zero between 16:11:21.8 and 16:11:22.3 UTC (Horizons ephemerides are reported in time steps no smaller than 0.5 s). We therefore set \(t_{AB}=\) 16:11:22, with an uncertainty of \(\pm 0.3\) s. This uncertainty is in fact contained within the \(\pm 0.5\) s uncertainty in manually starting the experiment, so that, taking into account the quantity \(\Delta t_{CMB}=1.56\) s determined above, we get

for the absolute error. An additional uncertainty of 1 arcsec (that is, \(5\times 10^{-6}\) rad) in the mean longitude coordinate of the polarizers can be neglected, since it produces a time error of \(\sim 5\times 10^{-6}/\omega \simeq 0.07\) s, while a displacement of 1 arcsec from perfect east-west alignment of the baseline gives a time error of \(\sim 5\times 10^{-6}/\left( \omega \tan {\chi }\right) \simeq 0.008\) s, as can be deduced from 8 . As for the parameters \(\beta\) and \(\chi\) to insert into Eq. ( 20 ) below, at the time of the experiment we found \(\beta =1.33\times 10^{-3}\) and \(\chi =83.60^{\circ }\) .

Derivation of equation 3

To prove that equation ( 2 ) must be replaced with equations ( 3 ) and ( 4 ), let us first suppose that \(0\le \sigma _{AB}\le \delta _{step}t/2\) . Denoting by \(t_{AB}^{(1)}\) the estimated time for \(t_{AB}\) , with uncertainty \(\sigma _{AB}\) , the time interval of the S measurement in which we expect to find \(\beta _{AB}=0\) is \(\left[ t_{AB}^{(1)}-\delta _S\,t/2,t_{AB}^{(1)}+\delta _S\,t/2 \right]\) . This interval can be rewritten as \(\left[ t_{AB}-\delta _S\,t/2-\left( t_{AB}-t^{(1)}_{AB}\right) ,t_{AB}+\delta _S\,t/2 -\left( t_{AB}-t^{(1)}_{AB}\right) \right]\) , and, following the same argument as in ( 2 ), in this time interval we find the upper bound:

If \(\delta _{step}t/2<\sigma _{AB} \le 3\delta _{step}t/2\) , we have two subcases: (a) \(\left| t_{AB}-t^{(1)}_{AB}\right| \le \delta _{step}t/2\) and (b) \(\delta _{step}t/2<\left| t_{AB}-t^{(1)}_{AB}\right| \le 3\delta _{step}t/2\) . In subcase (a) we have

In subcase (b), \(t_{AB}\) is actually nearer to \(t_{AB}^{(0)}\) or \(t_{AB}^{(2)}\) , the times on which the previous and the next S acquisition periods are centered, respectively, since \(t^{(2)}_{AB}-t^{(1)}_{AB}=t^{(1)}_{AB}-t^{(0)}_{AB}=\delta _{step}t\) by the very definition of \(\delta _{step}t\) . The bound on \(\beta _{AB}(t)\) therefore has to be calculated inside one of these adjacent time intervals. It is easy to see that \(\delta _{step}t/2<\left| t_{AB}-t^{(1)}_{AB}\right| \le 3\delta _{step}t/2\) implies \(\left| t_{AB}-t^{(0)}_{AB}\right| \le \delta _{step}t/2\) or \(\left| t_{AB}-t^{(2)}_{AB}\right| \le \delta _{step}t/2\) . Repeating the same arguments as above, we obtain

Thus the upper bound for \(\beta _{AB}(t)\) cannot exceed \(\omega (\delta _S\ t/2+\delta _{step}t/2)\beta \sin {\chi }\) even when \(\delta _{step}t/2<\sigma _{AB}\le 3\delta _{step}t/2\) . The same argument also applies in the case \(\sigma _{AB} \ge 3 \delta _{step}t/2\) , by considering additional S acquisition time intervals.

We can summarize the above by asserting the following upper bound for \(\beta _{AB}\) :

Observe that if \(\sigma _{AB}= 12\) h, that is, if no assumption is made about the PF, and if \(\delta _S\, t = \delta _{step}\, t< 12\) h, as in the case of a continuous sequence of non-overlapping measurements of S, then \(\delta t = 2 \delta _S\, t\) and we recover the same expression for \(\delta t\) as that derived in 9 .
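The summarizing inequality and Eqs. ( 3 ) and ( 4 ) themselves are not visible in this copy. A form consistent with all of the statements above (a hedged reconstruction, not a quotation of the paper) is

\[
\left| \beta _{AB}(t)\right| \le \omega \,\frac{\delta t}{2}\,\beta \sin {\chi },
\qquad
\delta t=\delta _S\, t+\min \left( 2\sigma _{AB},\,\delta _{step}t\right) ,
\]

which reproduces the bound \(\omega (\delta _S\ t/2+\delta _{step}t/2)\beta \sin {\chi }\) quoted above for intermediate \(\sigma _{AB}\) , the value \(\delta t=\delta _S\, t+2\sigma _{AB}=94\) s used in the next section, and \(\delta t=2\delta _S\, t\) in the non-overlapping, PF-agnostic limit just discussed.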

Numerical estimation of \(\beta _{t,max}\)

Taking into account Eqs. ( 1 ) and ( 3 ), the bound on the speed of spooky action \(\beta _{t,max}\) that can be constrained is:

The Earth's angular velocity is fixed ( \(\omega =7.29\times 10^{-5}\) rad/s), while the modulus of the Earth-PF relative velocity \(\beta\) and its polar angle \(\chi\) vary only slightly over the year. The main effort was therefore to reduce \(\delta t\) and \(\rho\) .

The term \(\delta t\) is given in Eq. ( 4 ), with \(\delta _{step} t\) equal to the sum of \(\delta _c t\) and \(\delta _r t\) . \(\delta _c t\) was set to 5 s, the minimum time required to obtain a reliable number of coincidences. We set the coincidence window on the time tagger (although it does not appear in Eq. ( 20 ), it nevertheless affects the number of coincidences) by choosing the minimum time \(t_{min}\) below which the coincidence number C rapidly decreases and above which C no longer increases (in other words, a smaller \(t_{min}\) would miss real coincidences, while a larger \(t_{min}\) would admit spurious ones). As for \(\delta _r t\) , the electric rotators required \(\simeq 0.6\) s to change the polarizer orientations each time, so that \(\delta _{step}t\simeq 5.6\) s, greater than \(2\sigma _{AB}=4\) s as estimated in the paragraph CMB frame velocity . Since 16 coincidence counting measurements are needed to estimate S , we obtain \(\delta _S t=16\left( \delta _c t+\delta _r t\right) \simeq 90\) s, and finally \(\delta t=\delta _S t+2\sigma _{AB}=94\) s.

The quantity \(\rho\) quantifies how well the two optical paths are balanced, and is given by

Here, \(d_{AB} = d_1+d_2 = (6923.1\pm 0.1)\) mm is the length of the baseline A-B, more precisely the distance between the polarizers in front of the detectors, and \(d_1\) ( \(d_2\) ) is the length of the first (second) arm of the experiment.

In this experiment we used absorption polarizers, so we can assume that the wave-function collapse happens at the polarizers 9 . Indeed, an absorption polarizer behaves like a measuring device: a photon arriving at it is either absorbed (vacuum state) or transmitted with linear polarization.

\(\Delta d\) represents the uncertainty in the equalization of the effective optical paths from the source to the polarizers at A and B, and has the following contributions:

the geometrical uncertainty in the balance of the two arms of the experiment, namely the uncertainty of \(d_1-d_2\) : \(\delta d_{12} \simeq 0.1\) mm;

the coherence length of the photon pairs, \(c \delta \tau \simeq 0.13\) mm. It was measured before the spectral-filter separation using a Hong-Ou-Mandel (HOM) dip measurement, as shown in Fig.  6 , and recalculated to take into account the temporal broadening introduced by the spectral filter;

the finite thickness of the absorption polarizers, \(\delta d_{pol}\) . More precisely, we used LPNIR050-MP2 polarizers from Thorlabs, whose thickness is about 220 \(\upmu\) m. The extinction ratio at 1550 nm is 953,000, so 99% of the photons with orthogonal polarization are absorbed in a layer of about 75 \(\upmu\) m. From these considerations we can assume \(\delta d_{pol} \simeq 0.075\) mm.

Figure 6: Hong-Ou-Mandel dip. The pairs generated by the HPS are separated using a polarizing beam splitter. A \(45^{\circ }\) half waveplate placed in one arm makes the two photons' polarizations parallel. The photons are sent to the two input ports of a non-polarizing beam splitter, whose output ports are coupled to two single-photon detectors. The coincidence counts are acquired while changing the relative path between the photons with a micrometer translator. The HOM dip is used to extract the coherence time of the photons through a simple unweighted fitting procedure.
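The caption mentions extracting the photon coherence time from the HOM dip with a simple unweighted fit. A minimal sketch of such a fit is given below; it assumes a Gaussian dip shape and uses synthetic data (the actual dip shape depends on the joint spectrum of the pairs), so it illustrates the procedure rather than the authors' analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal, illustrative HOM-dip fit (not the authors' analysis code).
# Model: coincidences vs relative delay, with a Gaussian dip of 1/e half-width tau_c.
def hom_model(delay, baseline, visibility, tau_c, delay0):
    return baseline * (1.0 - visibility * np.exp(-((delay - delay0) / tau_c) ** 2))

# Synthetic data: delays in ps, ~0.4 ps coherence time (~0.12 mm of path).
rng = np.random.default_rng(0)
delays = np.linspace(-3.0, 3.0, 61)
truth = hom_model(delays, 1000.0, 0.9, 0.4, 0.0)
counts = rng.poisson(truth).astype(float)

# Unweighted least-squares fit, as in the caption.
p0 = [counts.max(), 0.5, 1.0, 0.0]
popt, _ = curve_fit(hom_model, delays, counts, p0=p0)
baseline, visibility, tau_c, delay0 = popt
print(f"fitted coherence time ~ {abs(tau_c):.2f} ps, "
      f"coherence length ~ {abs(tau_c) * 0.2998:.3f} mm")   # c = 0.2998 mm/ps
```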

By adding the various contributions we have:

so that we get \(\rho \simeq 2.6\times 10^{-5}\) .
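Pulling the quoted numbers together: the sketch below adds the three path contributions in quadrature, an assumption that reproduces the quoted \(\rho \simeq 2.6\times 10^{-5}\) , and then estimates the attainable bound assuming it scales approximately as \(1/\left( \rho +\beta _{AB,max}\right)\) with \(\beta _{AB,max}=\omega \left( \delta t/2\right) \beta \sin {\chi }\) . This reproduces the quoted \(\sim 3.3\times 10^{4} c\) , but the exact form of Eq. ( 20 ) may contain further factors close to unity.

```python
import math

# Timing budget (values from the text).
delta_c, delta_r, sigma_ab = 5.0, 0.6, 2.0           # s
delta_step = delta_c + delta_r                       # ~5.6 s
delta_S = 16 * delta_step                            # ~90 s
delta_t = delta_S + 2 * sigma_ab                     # ~94 s

# Path-balance figure of merit rho = Delta_d / d_AB, adding the three contributions
# in quadrature (an assumption consistent with the quoted 2.6e-5).
d_ab = 6923.1                                        # mm
delta_d = math.sqrt(0.1**2 + 0.13**2 + 0.075**2)     # mm: geometry, coherence, polarizer
rho = delta_d / d_ab

# Maximum residual baseline projection during the acquisition.
omega, beta, chi = 7.29e-5, 1.33e-3, math.radians(83.60)
beta_ab_max = omega * (delta_t / 2) * beta * math.sin(chi)

# Leading-order estimate of the constrainable bound (assumed form, see lead-in).
bound = 1.0 / (rho + beta_ab_max)
print(f"rho ~ {rho:.2e}, beta_AB_max ~ {beta_ab_max:.2e}, bound ~ {bound:.2e} c")
```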

Effects due to temperature variation \(\Delta T\) are negligible ( \(\Delta T < 0.1 ^\circ\) C) since the experiment is performed in a controlled laboratory and lasts only eleven minutes.

Data availability

The data that support the findings of this study are available from the corresponding authors on request.

Bell, J. S. On the Einstein Podolsky Rosen paradox. Physics 1 , 195 (1964).

Einstein, A., Podolsky, B. & Rosen, N. Can quantum-mechanical description of physical reality be considered complete? Phys. Rev. 47 , 777 (1935).

Clauser, J., Horne, M., Shimony, A. & Holt, R. Proposed experiment to test local hidden-variable theories. Phys. Rev. Lett. 23 , 880 (1969).

Aspect, A. Closing the door on Einstein and Bohr’s quantum debate. Physics 8 , 123 (2015).

Bertlmann, R. & Zeilinger, A. QUANTUM [UN]SPEAKABLES II, Half a Century of Bell’s Theorem, The Frontiers Collection (Springer, 2017).

Eberhard, H. Quantum Theory and Pictures of Reality (Springer, 1989).

Scarani, V., Tittel, W., Zbinden, H. & Gisin, N. The speed of quantum information and the preferred frame: Analysis of experimental data. Phys. Lett. A 276 , 1–7 (2000).

Salart, D., Baas, A., Branciard, C., Gisin, N. & Zbinden, H. Testing the speed of ‘spooky action at a distance’. Nature (London) 454 , 861 (2008).

Cocciaro, B., Faetti, S. & Fronzoni, L. Improved lower bound on superluminal quantum communication. Phys. Rev. A 97 , 052124 (2018).

Yin, J. et al. Lower bound on the speed of nonlocal correlations without locality and measurement choice loopholes. Phys. Rev. Lett. 110 , 260407 (2013).

Aerts, S., Kwiat, P., Larsson, J.-Å. & Żukowski, M. Two-photon Franson-type experiments and local realism. Phys. Rev. Lett. 83 , 2872–2875. https://doi.org/10.1103/physrevlett.83.2872 (1999).

Kofler, J., Ursin, R., Brukner, C. & Zeilinger, A. Comment on: Testing the speed of ’spooky action at a distance’. arXiv: Quantum Physics (2008).

Barnea, T. J., Bancal, J.-D., Liang, Y.-C. & Gisin, N. Tripartite quantum state violating the hidden-influence constraints. Phys. Rev. A . https://doi.org/10.1103/physreva.88.022123 (2013).

Bancal, J.-D. et al. Quantum non-locality based on finite-speed causal influences leads to superluminal signalling. Nat. Phys. 8 , 867–870. https://doi.org/10.1038/nphys2460 (2012).

D’Auria, V. et al. A universal, plug-and-play synchronisation scheme for practical quantum networks. npj Quantum Inf. . https://doi.org/10.1038/s41534-020-0245-9 (2020).

Zeilinger, A. Quantum teleportation, onwards and upwards. Nat. Phys. 14 , 3–4. https://doi.org/10.1038/nphys4339 (2018).

Kerstel, E. et al. Nanobob: A CubeSat mission concept for quantum communication experiments in an uplink configuration. EPJ Quantum Technol. . https://doi.org/10.1140/epjqt/s40507-018-0070-7 (2018).

Gündoğan, M. et al. Proposal for space-borne quantum memories for global quantum networking. npj Quantum Inf. . https://doi.org/10.1038/s41534-021-00460-9 (2021).

Pirandola, S. & Braunstein, S. L. Physics: Unite to build a quantum internet. Nature 532 , 169–171. https://doi.org/10.1038/532169a (2016).

Kómár, P. et al. A quantum network of clocks. Nat. Phys. 10 , 582–587. https://doi.org/10.1038/nphys3000 (2014).

Defienne, H., Ndagano, B., Lyons, A. & Faccio, D. Polarization entanglement-enabled quantum holography. Nat. Phys. 17 , 591–597. https://doi.org/10.1038/s41567-020-01156-1 (2021).

Fink, M. et al. Experimental test of photonic entanglement in accelerated reference frames. Nat. Commun. . https://doi.org/10.1038/ncomms15304 (2017).

Zych, M., Costa, F., Pikovski, I. & Brukner, Č. Bell’s theorem for temporal order. Nat. Commun. . https://doi.org/10.1038/s41467-019-11579-x (2019).

Zbinden, H., Brendel, J., Gisin, N. & Tittel, W. Experimental test of nonlocal quantum correlation in relativistic configurations. Phys. Rev. A . https://doi.org/10.1103/physreva.63.022111 (2001).

Hardy, L. Quantum mechanics, local realistic theories, and Lorentz-invariant realistic theories. Phys. Rev. Lett. 68 , 2981–2984. https://doi.org/10.1103/physrevlett.68.2981 (1992).

Ciborowski, J. & Rembieliński, J. Search for a preferred frame of the photon. Phys. Rev. A . https://doi.org/10.1103/physreva.100.032103 (2019).

Caban, P. & Rembieliński, J. Lorentz-covariant quantum mechanics and preferred frame. Phys. Rev. A 59 , 4187–4196. https://doi.org/10.1103/physreva.59.4187 (1999).

Penzias, A. & Wilson, R. A measurement of excess antenna temperature at 4080 Mc/s. ApJ 142 , 419–421. https://doi.org/10.1086/148307 (1965).

Conklin, E. Velocity of the earth with respect to the cosmic background radiation. Nature 222 , 971–972. https://doi.org/10.1038/222971a0 (1969).

Henry, P. Isotropy of the 3 K background. Nature 231 , 516–518. https://doi.org/10.1038/231516a0 (1971).

Corey, B. & Wilkinson, D. A measurement of the cosmic microwave background anisotropy at 19 GHz. Bull. Am. Astron. Soc. 8 , 351 (1976).

Smoot, G., Gorenstein, M. & Muller, R. Detection of anisotropy in the cosmic blackbody radiation. PRL 39 , 898–901 (1977).

Smoot, G., Bennett, C., Kogut, A. et al . Structure of the COBE differential microwave radiometer first-year maps. ApJ 396 , L1–L5. https://doi.org/10.1086/186504 (1992).

Hinshaw, G. et al. Five-year Wilkinson microwave anisotropy probe observations: Data processing, sky maps, and basic results. Astrophys. J. Suppl. Ser. 180 , 225–245. https://doi.org/10.1088/0067-0049/180/2/225 (2009).

Planck Collaboration et al. Planck 2018 results-I. Overview and the cosmological legacy of Planck. A & A 641 , A1. https://doi.org/10.1051/0004-6361/201833880 (2020).

NASA Jet Propulsion Laboratory, Horizons Web Application. https://ssd.jpl.nasa.gov/horizons/app.html#/ . Ephemeris Type: Observer Table; Target Body: Earth-Moon Barycenter; Coordinate Center: Sun (Barycenter), Time Specification: Start=2021-09-23 02:40 UT , Stop=2021-09-23 02:41, Step=60 (fixed), Accessed: 2021-11-20.

Urban, S. E. & Seidelmann, P. K. Explanatory Supplement to the Astronomical Almanac (3rd edn.) (University Science Books, 2013).

International Earth Rotation and Reference System. https://datacenter.iers.org/ . Accessed 20 Nov 2021.

Simon, J. et al. Numerical expressions for precession formulae and mean elements for the Moon and the planets. A & A 282 , 663–683 (1994).

Ma, C. et al. The international celestial reference frame as realized by very long baseline interferometry. Astron. J. 116 , 516–546 (1998).

Charlot, P. et al. The third realization of the international celestial reference frame by very long baseline interferometry. A & A 644 , A159 (2020).

NASA Jet Propulsion Laboratory, Horizons Web Application. https://ssd.jpl.nasa.gov/horizons/app.html#/ . Ephemeris Type: Vector Table; Target Body: Earth (Barycenter); Coordinate Center: Solar System Barycenter, Time Specification: Start=2021-12-20 16:09 UT , Stop=2021-12-20 16:13, Step=480 (fixed), Accessed 20 Nov 2021.

Acknowledgements

We gratefully acknowledge support by the Italian Space Agency (ASI) through the Nonlinear Interferometry at Heisenberg Limit (NIHL) project (CUP F89J21027890005). This work was co-funded by the European Union - PON Ricerca e Innovazione 2014-2020 FESR/FSC - Project ARS01_00734 QUANCOM.

Author information

Authors and Affiliations

Agenzia Spaziale Italiana, Centro Spaziale Matera, Contrada Terlecchia snc., 75100, Matera, Italy

Luigi Santamaria Amato, Deborah Katia Pallotti, Mario Siciliani de Cumis, Daniele Dequal & Andrea Andrisani

Centre for Quantum Dynamics and Centre for Quantum Computation and Communication Technology, Griffith University, Brisbane, QL, 4111, Australia

Sergei Slussarenko

Contributions

L.S.A. conceived the experiment; L.S.A., D.K.P. and D.D. realized the apparatus; L.S.A. and M.S.d.C. performed the data acquisition and analysis; A.A. performed the numerical calculations concerning reference frames and synchronization; S.S. assisted with the experimental design; all authors discussed the results and contributed to the final manuscript.

Corresponding author

Correspondence to Luigi Santamaria Amato .

Ethics declarations

Competing interests.

The authors declare no competing interests.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cite this article.

Santamaria Amato, L., Pallotti, D.K., Siciliani de Cumis, M. et al. Testing the speed of “spooky action at a distance” in a tabletop experiment. Sci Rep 13 , 8201 (2023). https://doi.org/10.1038/s41598-023-35280-8

Received : 21 July 2022

Accepted : 16 May 2023

Published : 21 May 2023

DOI : https://doi.org/10.1038/s41598-023-35280-8

January 5, 2018

Quantum 'spooky action at a distance' becoming practical

by Griffith University

A team from Griffith's Centre for Quantum Dynamics in Australia has demonstrated how to rigorously test whether pairs of photons - particles of light - display Einstein's "spooky action at a distance", even under adverse conditions that mimic those outside the lab.

They demonstrated that the effect, also known as quantum nonlocality, can still be verified even when many of the photons are lost by absorption or scattering as they travel from source to destination through an optical fiber channel. The experimental study and techniques are published in the journal Science Advances.

Quantum nonlocality is important in the development of new global quantum information networks, which will have transmission security guaranteed by the laws of physics. These are the networks where powerful quantum computers can be linked.

Photons can be used to form a quantum link between two locations by making a pair of photons that are "entangled" - so that measuring one determines the properties of its twin - and then sending one along a communication channel.

Team leader Professor Geoff Pryde said a quantum link had to pass a demanding test that confirmed the presence of quantum nonlocality between particles at either end.

"Failing the test means an eavesdropper might be infiltrating the network," he said.

"As the length of quantum channel grows, less and less photons successfully pass through the link, because no material is perfectly transparent and absorption and scattering take their toll.

"This is a problem for existing quantum nonlocality verification techniques with photons. Every photon lost makes it easier for the eavesdropper to break the security by mimicking entanglement."

Developing a method to test entanglement in the presence of loss has been an outstanding challenge for the scientific community for quite some time.

The team used a different approach - quantum teleportation - to overcome the problem of lost photons.

Dr Morgan Weston, first author of the study, said they selected the few photons that survived the high-loss channel and teleported those lucky photons into another, clean and efficient quantum channel.

"There, the chosen verification test, called quantum steering, could be done without any problem," she said.

"Our scheme records an additional signal that lets us know if the light particle has made it through the transmission channel . This means that the failed distribution events can be excluded up front, allowing the communication to be implemented securely even in the presence of very high loss."

This upgrade doesn't come easy - the teleportation step requires additional high-quality photon pairs on its own. These extra photon pairs have to be generated and detected with extremely high efficiency, in order to compensate for the effect of the lossy transmission line.

This was possible to achieve thanks to state-of-the-art photon source and detection technology, jointly developed with the US National Institute of Standards and Technology in Boulder, Colorado.

Although the experiment was performed in the laboratory, it tested channels with photon absorption equivalent to about 80 km of telecommunications optical fiber.

The team aims to integrate their method into quantum networks that are being developed by the Australian Research Council Centre of Excellence for Quantum Computation and Communication Technology, and test it in real-life conditions.

Journal information: Science Advances

Provided by Griffith University

Spooky Action Is Real: Bizarre Quantum Entanglement Confirmed in New Tests

Sorry to break it to you, Einstein, but it looks like the universe is one big dice game.

Two recent studies have confirmed that the "spooky action at a distance" that so upset Albert Einstein (the notion that two entangled particles separated by long distances can instantly affect each other) works in a stunning array of different experimental setups.

One experiment closed two of the three loopholes in proofs of spooky action at a distance. Another found that quantum entanglement works over astonishingly large distances. And future tests are focused on making the final loophole as small as possible.

Overall, the new series of tests is simply confirming what physicists have long suspected.

"There is no hidden, more fundamental theory underneath quantum mechanics ," said Ronald Hanson, a physicist at Delft University in the Netherlands and the lead investigator in one of the new experiments.

But although the new tests don't break new theoretical ground, they could pave the way for quantum computing and perfectly secure communication technologies, Hanson said.

Entangled particles

In the 1920s and 1930s, physicists studying subatomic particles began scratching their heads. They found that the Schrödinger wave equation, the fundamental quantum mechanics equation, could not describe the individual state or position of some groups of particles, dubbed entangled particles, until each individual particle was measured. Once each particle was measured, the wave function "collapses," and the particle takes on a definite state.

In a 1935 paper, Einstein and his colleagues Boris Podolsky and Nathan Rosen created a thought experiment known as the EPR paradox (after the initials of their last names) to show some of the absurd implications of the wave equation. According to the rules of quantum mechanics, entangled particles travel in a kind of superposition of all their possible states. But even weirder, the wave equation implied that once measured, two entangled particles could somehow instantly communicate, much faster than the speed of light, to link up their states. Discounting this "spooky action at a distance," Einstein and his colleagues instead argued that some hidden variable must somehow affect the states of both particles.

Inequality and loophole

For decades, physicists were in limbo, unsure whether Einstein's hidden variable or the straightforward interpretation of the Schrödinger wave equation was correct. Then, in the 1960s, physicist John Stewart Bell proposed a straightforward test, known as Bell's Inequality, to test spooky action at a distance. If spooky action were real, Bell proposed, then entangled particles measured some distance apart would have correlated states more than a certain percentage of the time. And if some hidden variable were affecting these seemingly entangled particles, then entangled particles would have correlated states less than that fraction of the time.    
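For readers who want to see the arithmetic behind Bell's test, here is a small sketch (not part of the original article) comparing the quantum-mechanical prediction for a pair of entangled spins with the limit of 2 that Bell's inequality imposes on any local hidden-variable model; the analyzer angles are the standard CHSH choices.

```python
import math

# Quantum-mechanical correlation for two spins prepared in the singlet state:
# E(a, b) = -cos(a - b) for analyzer angles a and b (in radians).
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # ~2.828 = 2*sqrt(2), above the local hidden-variable limit of 2
```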

In hundreds of Bell experiments since then, physicists have found that entangled particles do seem to have correlated states at faster-than-light speeds.

But all of these tests have had at least a few caveats, or loopholes. One is that detectors used to measure entangled particles such as photons often miss many of the particle duos. Therefore, experiments were analyzing the statistics on only a small fraction of the photons, raising the possibility that the undetected photons could change the picture, Hanson said.

Another loophole is the idea that perhaps the two entangled particles could somehow communicate their state to each other before they are detected. The third loophole is the idea that the random choice of an entangled state is not random at all, but somehow biased in a way humans don't perceive.

Closing the loopholes

Now, researchers are starting to close those loopholes.

For instance, University of Vienna physicist Anton Zeilinger and his colleagues showed that entangled particles that are 89 miles (143 kilometers) apart still act as quantum mechanics predicts they would. The test, described in a paper published Nov. 5 in the journal Proceedings of the National Academy of Sciences, relies on a massive detector set up on Spain's Canary Islands. (Some argue that the actual photons in this experiment are only entangled over a short distance, and that the experiment is a demonstration of long-distance quantum teleportation, not entanglement, Hanson said.)

And just a few weeks before that, in a paper published Oct. 23 in the journal Nature (and originally in the open-access, preprint journal arXiv), Hanson and his colleagues showed that the violation of Bell's inequality holds even with the first two loopholes closed simultaneously.

To close the loopholes, Hanson and his team used a novel material: diamonds with a nitrogen vacancy defect, or a hole in the atomic matrix where an atom should be. That hole traps extra electrons, which become the particles to be entangled. So the team used two separate diamond crystals, separated by almost 1 mile (1.6 km) across the university campus.

To entangle the electrons, the team excited the electrons on either side of campus in such a way that the spin (the tiny bar-magnet-like orientation of the electron) was either "up" or "down." Each of the excited electrons then emitted a photon, and both of these photons traveled to a beam splitter roughly in the middle and arrived at exactly the same time. The beam splitter has an equal chance of either reflecting or transmitting both photons, essentially making it impossible to tell which side of campus the photons came from. Once the photons were detected at the beam splitter, the team measured the electrons on either side of campus to see whether their spins were correlated. Sure enough, the team found the electron correlation was high enough to bolster the notion of spooky action at a distance.

The new result closes both loopholes because the initial spin state of the electrons is detected 100 percent of the time: they are sitting in the diamond the entire time, Hanson said. In addition, the two diamonds are sufficiently far apart that there's no chance for the two electrons to communicate in the time it takes to do the measurement, he added.

Unfinished business

Hanson's new results beautifully close the first two loopholes, said David Kaiser, a physicist at the Massachusetts Institute of Technology in Cambridge, who was not involved in either of the two new experiments.

However, there's still one loophole left, he said.

Everyone uses some sort of random number generator to decide a particle's state, Kaiser said. But what if those random numbers weren't truly random?

The third loophole asks: "Did any process in the past of this whole experiment nudge or bias or somehow skew the set of questions that would be asked?" Kaiser told Live Science.

So Kaiser and Zeilinger are devising a test that, they say, would shrink that third loophole considerably. The team would derive its random numbers from luminous regions near galactic centers called quasars, which are so distant that the light from them has taken 11 billion to 12 billion years to reach Earth. While that doesn't completely eliminate the loophole (after all, the random numbers could have been rigged at the universe's fiery birth), it gets it pretty close, Kaiser said.

Still, not everyone thinks that setup actually gets closer to closing the third loophole.

"Whatever setup you make, you just cannot prove that some signals were not predetermined before you saw them," Hanson said. "At the deepest fundamental level, this loophole cannot be closed."

Beyond that, the starlight method assumes the light from the quasars couldn't have been messed with by some hidden variables on its long journey to Earth, Hanson added. While that seems like a long shot, it seems equally paranoid to believe that another type of random-number generator is somehow rigged, he added.

(On Nov. 10, researchers at the National Institute of Standards and Technology in Boulder, Colorado, published a paper in the preprint journal arXiv claiming they had demonstrated quantum entanglement with all three loopholes closed. However, that paper has not yet been subject to peer review, the standard process for vetting scientific claims, and it actually uses a similar approach and similar random number generators to those used in Hanson's experiments, so it also doesn't get any closer to eliminating that third loophole, Hanson said.)

Long-term applications

At this point, it's fair to ask: Why spend all these resources testing a premise that almost all physicists believe is true?

Hanson, Kaiser, Zeilinger and others don't expect their loophole-free tests to change the fundamental understanding of subatomic physics. Rather, the long-term applications may have more to do with the future of computing. Quantum encryption, which could one day become a perfectly secure method of encryption, relies on the understanding of quantum mechanics as scientists know it today.

Extending the length across which particles can be entangled could also have cool applications, Hanson said.

"Many people said this is going to be the end of this very long history, but I'm more excited about the beginning of the new field," Hanson said.

Quantum Computing Compact

Spooky Action at a Distance and Teleportation Easy to Understand

  • Bettina Just

THM Technische Hochschule Mittelhessen, Gießen, Germany

The "spooky action at a distance" questioned by Einstein made understandable

Completely new explanatory methodology, tested with students

Easy to understand due to catchy illustrations and overall presentation



Table of contents (12 chapters)

  • Introduction
  • Quantum Entanglement: Photons as Qubits; The First Experiment: Independence; The Second Experiment: Equality; The Third Experiment: Spooky Action at a Distance; Evaluations and Interpretations
  • Quantum Computing with the Example of Teleportation: Quantum Algorithms Vividly; Quantum Bits and Quantum Registers; Quantum Gates on One Qubit; CNOT: A Quantum Gate on Two Qubits; Teleportation; Further Quantum Algorithms and Hardware

What is the phenomenon of quantum entanglement? If you read popular science literature, there is talk of socks that are red and blue at the same time, but monochromatic - how is that supposed to work? If you read scientific literature, you have to have knowledge of functional analysis.

This book vividly builds the bridge between the experiments that led to quantum entanglement and the algorithm for teleportation, assuming only an elementary knowledge of mathematics.

  • Quantum entanglement
  • Experiments
  • Spooky action at a distance
  • Bell's inequality
  • illustrative explanation

Book Title: Quantum Computing Compact

Book Subtitle: Spooky Action at a Distance and Teleportation Easy to Understand

Authors: Bettina Just

DOI: https://doi.org/10.1007/978-3-662-65008-0

Publisher: Springer Berlin, Heidelberg

eBook Packages: Computer Science, Computer Science (R0)

Copyright Information: Springer-Verlag GmbH Germany, part of Springer Nature 2022

Softcover ISBN: 978-3-662-65007-3 (published 02 January 2023)

eBook ISBN: 978-3-662-65008-0 (published 01 January 2023)

Edition Number: 1

Number of Pages: XIII, 103

Number of Illustrations: 3 b/w illustrations, 54 illustrations in colour

Topics: Numerical Analysis


The era of quantum computing is about to begin, with profound implications for the global economy and the financial system. Rapid development of quantum computing brings both benefits and risks. Quantum computers can revolutionize industries and fields that require significant computing power, including modeling financial markets, designing new effective medicines and vaccines, and empowering artificial intelligence, as well as creating a new and secure way of communication (quantum Internet). But they would also crack many of the current encryption algorithms and threaten financial stability by compromising the security of mobile banking, e-commerce, fintech, digital currencies, and Internet information exchange. While the work on quantum-safe encryption is still in progress, financial institutions should take steps now to prepare for the cryptographic transition, by assessing future and retroactive risks from quantum computers, taking an inventory of their cryptographic algorithms (especially public keys), and building cryptographic agility to improve the overall cybersecurity resilience.

“I cannot seriously believe in it [...] physics should represent a reality in time and space, free from spooky action at a distance.”

Albert Einstein 2

  • I. Introduction

The quantum revolution is underway, with the pace of innovations accelerating in recent years. The most notable and much discussed example of quantum technology is quantum computing—the use of quantum physics to perform calculations that are intractable for even the most powerful current and future classical supercomputers. 3 Leading technological companies have already developed working prototypes of quantum computers and provided access to them for researchers through their cloud services. Around the world, dozens of known projects are underway, from major corporations to startups and universities, to build quantum systems using different core technologies. If one of them overcomes current technological obstacles and creates a fully functional quantum computer or finds a way to use the existing models to solve practical computational tasks that are beyond the limits of conventional computers, it would have profound implications.

Quantum computing has the potential to transform the global economy and the financial sector, by accelerating scientific discovery and innovation. Fully functional quantum computers—when they appear—should revolutionize industries and fields that require significant computing power for simulations and optimizations that are too complex for conventional computers. For the financial system, quantum machines can greatly reduce the time to analyze complex risk positions or run Monte Carlo simulations, as well as increase their accuracy. Quantum computing can also speed up machine learning and artificial intelligence.

Beyond computing, quantum technologies give rise to novel ways of fast and secure data transmission (i.e., quantum Internet), which has been successfully tested, and, at least in theory, will be unbreakable. Yet another long-term prospect is quantum cryptography, which could enhance cybersecurity.

However, quantum computers would also crack many cryptographic algorithms underpinning today’s cybersecurity. Algorithms enabling security of the financial system, including Internet communications, mobile banking transactions, and digital currencies and distributed ledger technologies, could become obsolete or would require a significant upgrade. For some applications it may already be too late because of the retroactive risks presented by quantum computers: any information assumed secure today can be captured and stored, and then deciphered once efficient quantum computers are created. 4 In fact, almost any encrypted personal or financial message sent and recorded today may be deciphered by a powerful quantum computer in the future. Most financial institutions and regulators have not internalized these novel risks yet.

While waiting for quantum-safe encryption standards, financial system regulators can play an important role by raising awareness of potential risks. Financial institutions should take steps now to prepare for a cryptographic transition. They should assess future and retroactive risks from quantum computers, including from information that has already been captured or that may be captured now, stored and exploited years later. Financial institutions should develop plans to migrate current cryptography to quantum-resistant algorithms. As a first step, they should take an inventory of public-key cryptography used within the institution, as well as by partners and third-party suppliers. These will eventually need to be transitioned to post-quantum cryptography once standards are available. And finally, they should build cryptographic agility to improve the overall cybersecurity resilience going forward. Past experiences of algorithm replacements, even though much simpler than the transition to post-quantum standards, show that they can be extremely disruptive and often take years or decades to accomplish. Therefore, the time for action is now.

The rest of the paper is organized as follows. Section II describes key concepts of quantum computing, sections III and IV discuss potential benefits and risks of quantum computers, and section V summarizes the main messages and presents the way forward. To complete the picture, the paper’s annexes provide a glossary of technical terms (Annex I), a brief history of encryption, cryptoanalysis and digital computers (Annex II), and a description of the main cryptographic algorithms currently in use and their vulnerabilities (Annexes III and IV).

  • II. What is Quantum Computing?

Quantum computing is the use of quantum phenomena such as superposition and entanglement to perform computations. The basic unit of a quantum computer is the qubit (short for quantum bit), typically realized by quantum properties of subatomic particles, like the spin of electrons or the polarization of a photon. While each bit, its counterpart in digital computers, represents a value of either zero or one, qubits represent both zero and one (or some combination of both) at the same time, a phenomenon called superposition. Quantum entanglement is a special connection between pairs or groups of quantum elements, whereby changing the state of one element affects other entangled elements instantly, regardless of the distance between them. The phenomenon is so counterintuitive that Albert Einstein famously derided entanglement as “spooky action at a distance” (Macmillan, 1971). By entangling qubits, the number of represented states rises exponentially, making it possible to explore a huge number of possibilities at once and conduct parallel calculations on a scale that is beyond the reach of traditional computers. Thanks to superposition and entanglement, adding just a few extra fully functioning qubits can lead to exponential leaps in processing power.
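As a rough illustration of these two ideas, the state of n qubits can be written as a vector of 2^n complex amplitudes. The minimal Python sketch below (using only numpy; the variable names are mine, not the paper's) builds a single qubit in an equal superposition and a two-qubit Bell state, the textbook example of entanglement.

```python
import numpy as np

# A single qubit: a length-2 vector of complex amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Equal superposition: measurement yields 0 or 1, each with probability |amplitude|^2 = 0.5.
plus = (ket0 + ket1) / np.sqrt(2)
print("P(0), P(1) =", np.abs(plus) ** 2)

# Two qubits live in a 4-dimensional space (amplitudes for 00, 01, 10, 11).
# The Bell state (|00> + |11>)/sqrt(2) is entangled: the two measurement
# outcomes are perfectly correlated, yet neither qubit has a definite value alone.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
print("Bell state amplitudes:", bell)               # [0.707, 0, 0, 0.707]
print("Outcome probabilities:", np.abs(bell) ** 2)  # only 00 and 11 ever occur
```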

Theoretically, quantum computers can outpace current (and future) traditional computers, the so-called quantum “supremacy” or quantum advantage. It is possible to model quantum computers’ states with traditional computers, but the resources required rise exponentially. One qubit can have values of zero and one at the same time and can be modeled with two traditional logical bits, each holding a value of zero or one. For two qubits, four traditional bits are needed; for three qubits, eight bits, and so on. To model a quantum computer with 54 qubits, one would need 2^54 = 18,014,398,509,481,984, or about 18 quadrillion, bits of traditional logical memory. As of end-2019, there was only one supercomputer in the world with such a large memory: the Summit (OLCF-4) supercomputer developed by IBM for Oak Ridge National Laboratory. To model a quantum computer with 72 qubits, one would need 2^72 bits, about 5 sextillion. This could be achieved, for example, by stacking together 262 thousand Summit-type supercomputers. A 100-qubit quantum computer would require roughly 10^30 bits, and a 280-qubit computer more bits than there are atoms in the known universe. These numerical examples illustrate the exponential power of quantum computers.
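The memory arithmetic in the paragraph above is easy to reproduce. A minimal sketch, following the text's one-bit-per-basis-state accounting:

```python
# Bits of classical memory needed to mirror every basis state of n qubits,
# using the one-bit-per-state accounting from the paragraph above.
for n in (54, 72, 100, 280):
    bits = 2 ** n
    print(f"{n:3d} qubits -> 2^{n} = {bits:.3e} bits")

# 54 qubits  -> ~1.8e16 bits (about 18 quadrillion)
# 72 qubits  -> ~4.7e21 bits (roughly 5 sextillion)
# 100 and 280 qubits -> ~1.3e30 and ~1.9e84 bits, far beyond any feasible memory
```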

Quantum computers are not only more powerful, they are also fundamentally different from today’s digital computers. They require different algorithms and infrastructure to solve existing and new mathematical problems. For illustration purposes, some complex computational tasks could be compared to a maze (e.g., finding the fastest route between two cities or the most efficient supply chain). This maze has a multitude of corridors leading nowhere and only one leading to the exit. A traditional computer tries to solve this problem the same way we might try to escape a maze: by trying every possible corridor and turning back at dead ends until we eventually find the way out. This can take a very long time. But superposition allows a quantum computer to try all the possible paths at once (i.e., quantum parallelism). This drastically reduces the time needed to find the solution, the so-called quantum speedup.

The quantum speedup depends, among other things, on the computational problem and the algorithm used. Grover’s and Shor’s algorithms are the two best-known quantum algorithms. They yield a polynomial speedup and an exponential speedup, respectively, over their classical counterparts (Kothari, 2020). A polynomial speedup is when a quantum computer solves a problem in time T while a classical computer needs time T^2. For example, Grover’s algorithm can solve a problem on a quantum computer in 1,000 steps that would take 1,000,000 steps on a classical computer. This type of algorithm can be used for so-called NP-complete problems, described as looking for a needle in an exponentially large haystack (e.g., finding symmetric keys and inverting hash functions). An exponential speedup is where a quantum computer takes time T but a classical computer takes time 2^T. If T is 100, there is a huge difference between 100 and 2^100, a number with more than 30 digits. This type of algorithm includes Shor’s algorithm, which can break asymmetric (public) keys. Such impressive speedups are one of the most promising and compelling aspects of quantum computers.
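To put numbers on the Grover-type (quadratic) speedup: brute-force search over N candidates needs on the order of N classical queries, while Grover's algorithm needs roughly (pi/4) x sqrt(N) quantum queries. A small sketch of that comparison, with arbitrary example problem sizes:

```python
import math

# Classical exhaustive search: ~N queries in the worst case.
# Grover's algorithm: ~(pi/4) * sqrt(N) quantum queries.
for bits in (20, 40, 128):
    n_candidates = 2 ** bits
    classical = n_candidates
    grover = math.ceil(math.pi / 4 * math.sqrt(n_candidates))
    print(f"search space 2^{bits}: classical ~{classical:.2e} queries, "
          f"Grover ~{grover:.2e} queries")
```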

Motivated by their potential power, researchers from leading technological companies are developing working prototypes of quantum computers. In 2019, Google engineers used their quantum machine powered by the 54-qubit Sycamore processor—which had 53 working qubits at that moment—to perform a specific computational task in just 200 seconds, while they estimated that the most powerful digital supercomputer available at the time would take 10,000 years to execute the same task. Google engineers presented it as proof of quantum “supremacy”, the confirmation that quantum computers may perform tasks virtually impossible for traditional computers (Arute et al., 2019). A competing research team from IBM disputed Google’s claims while promoting their own quantum computers. IBM claims that Google’s estimates are inaccurate and that the world’s fastest computer, Summit—built by IBM—could be modified to obtain the same results in about 3 days (Pednault et al., 2019), though they have not shown that in practice. Cementing claims for quantum advantage, in December 2020 a team of researchers from the University of Science and Technology of China in Hefei announced that their photonic quantum computer, named Jiuzhang, performed in 200 seconds a calculation that would take one of the most powerful supercomputers in the world 2.5 billion years to complete (Zhong et al., 2020). Importantly, they carried out the task on a photonic quantum computer working at room temperature.

Alongside, many other technological companies—from industry leaders to start-ups and universities—are working on quantum computers, increasing the probability of a breakthrough. As of January 2021, IBM has deployed 28 quantum computers for public and commercial use through its cloud services. In September 2020, IBM released a roadmap to produce a 1,000-plus qubit device called Quantum Condor by the end of 2023. Effectively, it means doubling or tripling the number of qubits in the quantum computer each year. Microsoft and Amazon have also launched beta versions of quantum computing cloud services—Azure Quantum and Amazon Braket—powered by suppliers such as 1Qbit, Rigetti, IonQ, and D-Wave. Around the world, there are at least 87 known projects underway to build quantum systems using different core technologies. 5

To reap the benefits of quantum computing, researchers need to build quantum machines that compute with lower error rates. Superposition and entanglement are fragile states. The interaction of qubits with the environment produces computation errors. Any external disturbance or noise, such as heat, light or vibrations, inevitably yanks qubits out of their quantum state and turns them into regular bits. Classical computers are also prone to random computational errors, albeit at much lower rates. By employing redundancy, error correction processes enable classical computers to produce practical, error-free computations. However, such techniques are not directly applicable in quantum computing because of the no-cloning principle: it is physically impossible to copy the running state of a qubit.

In 1995, Peter Shor proposed a theoretical quantum error-correcting code, achieved by storing the information of one qubit in a highly entangled state of several qubits. This scheme uses many ordinary qubits to create a single error-free entity: the former are called physical qubits, the latter logical qubits. But just adding more qubits might not boost a machine’s performance. The frequency of errors in delicate qubits and their operations, caused by noise, tends to increase as more qubits are connected. IBM has developed the concept of quantum volume to measure progress in quantum computing, which adjusts the number of qubits for, among other things, error rate and the quality of connectivity between qubits. 6 IBM expects quantum volume to more than double every year. Today’s quantum devices have error rates that are too high, which is one of the most pressing issues for quantum computers.

The race to build better quantum computers is intensifying, with companies using different technologies. Early quantum computing hardware can be classified into two general categories. First, quantum computers based on quantum gates and quantum circuits are the most similar to current classical computers based on logical gates. 7 The other great family is analog quantum computers, which directly manipulate the interactions between qubits without breaking these actions into gate operations. The best-known analog machines are quantum annealers. Some experimental quantum annealers are already commercially available; the most prominent example is the D-Wave processor, with over 5,000 qubits. This machine has been heavily tested in laboratories and companies worldwide, including Google, LANL, Texas A&M, and USC. Companies are also using several strategies to implement physical qubits. For example, Alibaba, IBM, Google, D-Wave, and Rigetti use superconducting qubits, IonQ uses trapped-ion qubits, while Xanadu and the University of Science and Technology of China are developing photonic quantum computers.

For the foreseeable future, quantum computers are expected to complement, not replace, classical computers. While desktop quantum computers are far off, the public can already access quantum computing through cloud services provided by companies such as IBM and D-Wave. People can use their classical computers to perform calculations on quantum computers and receive the results back on their classical computers. In the near future, quantum applications will probably be hybrid, since quantum and classical computing technologies have complementary strengths (National Academies of Sciences, 2019).

  • III. Potential Benefits of Quantum Computing

Quantum computers can transform the financial system, as they can solve many problems considerably faster and more accurately than the most powerful classical computers. Simulation, optimization, and machine learning (ML) are three areas where quantum computers can have an advantage over classical computers (Bouland et al., 2020; Egger et al., 2020; Orus et al., 2019):

Simulations: Monte Carlo-based methods. The use of simulations by the financial sector is ubiquitous. For example, Monte Carlo methods are used to price financial instruments and to manage risks. However, Monte Carlo simulations are computationally intensive, often leading to tradeoffs between accuracy and efficiency. Quantum computing could perform simulations such as pricing and risk management almost in real time, without the need for unrealistic assumptions to simplify the models (a small classical sketch follows this list).

Optimization models. Financial institutions make myriad optimization calculations every day, for example, to determine the best investment strategy for a portfolio of assets, allocate capital, manage cash in ATM networks, or increase productivity. Some of these optimization problems are hard, if not impossible, for traditional computers to tackle, so approximations are used to solve them within a reasonable time frame. Quantum computers could perform much more accurate optimizations in a fraction of the time without the need for approximations.

Machine learning (ML) methods, including neural networks and deep learning. Financial institutions are increasingly using ML. Examples include estimating the risk level of loans by credit scoring and detecting fraud by finding patterns that deviate from normal behavior. However, such ML tasks face the curse of dimensionality: the time needed to train an ML algorithm on classical computers increases exponentially with the number of dimensions considered. Even when a classical computer can handle these tasks, it can take too much time. Quantum computers have the potential to outperform classical algorithms by accelerating ML tasks (quantum speedup), enabling them to tackle more complex analyses while increasing accuracy.
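To make the Monte Carlo point concrete, the sketch below prices a simple European call option classically; the standard error shrinks only as 1/sqrt(N) with the number N of samples, which is the cost that quantum amplitude estimation is expected to improve to roughly 1/N (a quadratic speedup). All parameter values are illustrative, not taken from the paper.

```python
import numpy as np

# Classical Monte Carlo pricing of a European call under a lognormal model.
# Illustrative parameters only.
s0, strike, rate, vol, maturity = 100.0, 105.0, 0.01, 0.2, 1.0
rng = np.random.default_rng(seed=0)

for n_samples in (1_000, 100_000, 10_000_000):
    z = rng.standard_normal(n_samples)
    s_t = s0 * np.exp((rate - 0.5 * vol**2) * maturity + vol * np.sqrt(maturity) * z)
    payoff = np.exp(-rate * maturity) * np.maximum(s_t - strike, 0.0)
    price = payoff.mean()
    std_err = payoff.std(ddof=1) / np.sqrt(n_samples)   # shrinks like 1/sqrt(N)
    print(f"N={n_samples:>10,}  price={price:.4f}  std.err={std_err:.4f}")
```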

Beyond finance, quantum computing has the potential to be a catalyst for scientific discovery and innovation. An important application of quantum computing is for models of particle physics, which are often extraordinarily complex and require vast amounts of computing time for numerical simulation. Quantum computers would enable precision modeling of molecular interactions and finding optimal configurations for chemical reactions. They can transform areas such as energy storage, chemical engineering, material science, drug discovery and vaccines, simulation, optimization, and machine learning. Specifically, this would allow the design of new materials such as lightweight batteries for cars and airplanes, or new catalysts that can produce fertilizers more efficiently—a process which today accounts for over 2 percent of the world’s carbon emissions ( Martinis and Boixo, 2019 ). Quantum computers could also improve weather forecasts, optimize traffic routes and supply chains, and help us better understand climate change.

Beyond computing, quantum technologies give rise to novel ways of transmitting, storing and manipulating data. Quantum networks can transmit information in the form of entangled qubits between remote quantum processors almost instantaneously (quantum teleportation) and securely using quantum key distribution (QKD). Until recently, such networks could function only in laboratory conditions, but experiments have confirmed their viability for long-distance secure communications (Boaron et al., 2018). Moreover, data can be transmitted wirelessly through a quantum satellite in space. Scientists in China were able to transmit data using a quantum satellite launched in 2016 between a mobile ground station in Jinan (in eastern China) and a fixed station in Shanghai. ICBC bank and the People’s Bank of China are using satellite-based QKD for information exchanges between distant cities, such as Beijing and Urumqi in the far north-west. 8 9 In the Netherlands, a team from Delft University of Technology is building a network connecting four cities with quantum technology and has demonstrated that it can send entangled quantum particles over long distances. 10 In the U.S., a consortium of major institutions led by Caltech has demonstrated sustained, high-fidelity quantum teleportation over long distances, achieving the successful teleportation of qubits across 44 kilometers of fiber in two testbeds: the Caltech Quantum Network and the Fermilab Quantum Network. 11

Another promising avenue is quantum sensing devices. Advances have been reported in quantum radar, imaging, metrology, and navigation, which would enable greater precision and sensitivity. For example, medicine has started to reap the benefits of quantum sensors, which could revolutionize the detection and treatment of diseases. In the U.S., the Defense Advanced Research Projects Agency (DARPA) is running the Quantum-Assisted Sensing and Readout (QuASAR) program. Building on established control and readout techniques from atomic physics, it aims to develop a suite of measurement devices that could find application in biological imaging, inertial navigation and robust global positioning systems. 12

  • IV. Potential Risks of Quantum Computing

While quantum computing has tremendous potential to benefit society, it brings new risks and challenges. The massive computing power of quantum machines threatens modern cryptography, with far-reaching implications for financial stability and privacy. Quantum computers can solve what complexity theory calls hard mathematical problems exponentially faster than the most powerful classical supercomputers, potentially making today’s main cryptographic standards obsolete. In particular, quantum computing has the potential to make asymmetric cryptography (public-key cryptography) obsolete, while reducing the strength of other cryptographic keys and hashes.

Today’s cryptography is based on three main types of algorithms: symmetric keys, asymmetric (public) keys, and algorithmic hash functions, or hashing (see Annexes III and IV for further descriptions). These cryptographic algorithms, for the most part, have had the upper hand in maintaining the necessary security to protect data, provide integrity checks and digital signatures. They are generally deemed secure and unbreakable with today’s most advanced hardware and cryptanalysis techniques using conventional computers.

With symmetric-key encryption, an attacker needs to find the secret key shared between the sender and receiver to decrypt the cipher message, as shown in Figure 1 (top panel). 13 Conversely, with public-key encryption, the attacker needs to find the receiver’s private key, knowing their public key, to decrypt the message (middle panel). Asymmetric encryption algorithms are widely used to secure communications over the Internet. Successful attacks against these standard cryptographic algorithms would compromise secure connections, endangering the security of banking, e-commerce, and other services. With hash functions (bottom panel), an attacker would attempt to find a hash collision to match the output digest with a crafted, different input, allowing them to produce counterfeit authentication digests for transactions or documents.
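The three families can be illustrated in a few lines of Python. The sketch below uses the standard library for hashing and, assuming the widely used third-party `cryptography` package is installed, Fernet for symmetric encryption and RSA for public-key encryption; it is a toy illustration of the concepts, not a recommendation of specific parameters.

```python
import hashlib
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

message = b"wire transfer: 100 to account 42"   # hypothetical example payload
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# 1. Symmetric key: the same secret key both encrypts and decrypts.
secret_key = Fernet.generate_key()
box = Fernet(secret_key)
assert box.decrypt(box.encrypt(message)) == message

# 2. Asymmetric (public) key: encrypt with the public key, decrypt with the private key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
ciphertext = private_key.public_key().encrypt(message, oaep)
assert private_key.decrypt(ciphertext, oaep) == message

# 3. Hash function: a fixed-size digest used for integrity checks and signatures.
digest = hashlib.sha3_256(message).hexdigest()
print("SHA3-256 digest:", digest)
```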

Figure 1:

Types of Cryptographic Algorithms

Citation: IMF Working Papers 2021, 071; 10.5089/9781513572727.001.A001


Risks from quantum computing vary depending on the type of cryptographic algorithm (a short numerical sketch follows this list):

Symmetric cryptography, under certain conditions, is believed to be quantum resistant. Current security standards recommend using the AES algorithm with 256-bit keys for symmetric encryption. Known as AES-256, this algorithm is widely used for multiple purposes, such as securing Internet websites or wireless networks. An attacker would have to try 2^256 combinations to break a 256-bit AES key using brute force, an effort that would require over 7 billion years on a classical supercomputer, half the current age of the universe (CISA, 2019). A quantum computer running Grover’s algorithm could effectively halve the key length, reducing the work from 2^256 to about 2^128 operations (Grassl et al., 2015). However, it would still have to run for millions of years to break a single AES key using known methods. This leads most experts to believe that the algorithm is quantum resistant for now, as are other symmetric encryption methods of a similar nature.

Hashing functions are also believed to be quantum resistant under certain conditions. Hashing generates unique fixed-size codes from arbitrary inputs. Hashes are used to validate information and are leveraged in several cryptographic methods for diverse purposes, such as validating data or generating authentication codes. Their usefulness stems from being practically impossible to reverse: given a hash code, it would take thousands of years to produce an input that generates the same code (a preimage attack) or to find two inputs that produce the same code (a collision attack). As with symmetric cryptography, using Grover’s algorithm a quantum computer could reduce the time to reverse a hash function from 2^n to 2^(n/2), n being the number of bits of the hash output. Therefore, longer hash functions like the SHA-3 family, which typically generate 256-bit outputs, are considered quantum safe and expected to remain approved standards for now.

Public (or asymmetric) keys, however, can become obsolete with quantum computing. Theoretically, a fully functioning quantum computer could break an asymmetric key in a few hours by using Shor’s algorithm and related optimizations (Gidney et al., 2019). Furthermore, researchers believe that advances in quantum computing will reach a level of optimization that would allow quantum computers to break today’s public keys in less time than it takes to generate them on digital computers (Monz et al., 2016; Anschuetz et al., 2018).
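A back-of-the-envelope way to see the asymmetry in the list above: Grover's algorithm roughly halves the effective key or digest length (2^n work becomes about 2^(n/2)), which long symmetric keys and hashes can absorb, whereas Shor's algorithm removes the exponential hardness of factoring and discrete logarithms altogether. A small sketch of the surviving security margins, using the standard Grover estimate:

```python
# Effective key-search / preimage cost before and after Grover's algorithm.
primitives = {
    "AES-128 key": 128,
    "AES-256 key": 256,
    "SHA3-256 digest": 256,
}
for name, n_bits in primitives.items():
    print(f"{name:16s}: classical ~2^{n_bits}, with Grover ~2^{n_bits // 2}")

# RSA and elliptic-curve keys are different: Shor's algorithm breaks them in
# polynomial time, so no practical key length restores an exponential margin.
```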

Critical protocols behind digital data and communication security in the financial sector rely heavily on public-key cryptography. In the age of the Internet, public keys provide critical security services underpinning the financial sector. These include (Burr and Lyons-Burke, 1999): (i) authentication/authorization (the ability to corroborate the identity of a party that originated particular data or transactions, or that participates in a protocol); (ii) privacy/confidentiality (the ability to ensure that unauthorized individuals are not able to access protected data); and (iii) integrity (the ability to know that data has not been altered). For example, today’s digital certificates and digital signatures are based on asymmetric keys. These critical security services would be compromised by a sufficiently powerful quantum computer, threatening sensitive information managed and communicated by financial institutions and central banks. Put simply, an attacker who can forge signatures can effectively spend other people’s funds or masquerade as any entity.

Figure 2 shows some potential impacts of quantum computers on the different communication protocols used by the financial system:

1. Online/Mobile Banking. Using a quantum computer, an attacker may compromise public keys for standard Internet protocols and eavesdrop on any communications between users and financial institutions. Furthermore, an attacker may compromise authentication and authorization schemes, whether session-token or public-key based, to produce counterfeit transactions. Moreover, in the case of central bank digital currencies (CBDC) and blockchain networks, attackers may extract valid wallet keys from publicly available records, granting them the ability to appropriate users’ credits and tokens.

2. Payment Transactions and Cash Withdrawals. ATMs are connected through private networks that rely on public-key encryption, so attackers could tap into these connections and use the same avenues as for online or mobile banking to forge transactions.

3. Business-to-Business Privacy. Corporate point-to-point networks also use public-key encryption to build secure channels and to authenticate and authorize data exchanges between businesses. By compromising such channels, attackers would gain access to information that, once captured, would give them easy points of entry into corporate internal networks by impersonating users or servers through man-in-the-middle attacks. By forging certificates, for instance, attackers would be able to add their own resources to the enterprise network. Another form of attack is to record available encrypted data now and decrypt it once a quantum computer is available, revealing, for instance, current trade secrets in the future.

4. VPN Communications. VPN connections are used by staff of financial institutions to work from home and to access internal and sensitive organizational resources. Such connections typically use public-key encryption to authenticate users and workstations, making them vulnerable to the same issues as business-to-business connections.

Figure 2:

Quantum Computing: Selected Risks to the Financial Sector

Other applications relying on public-key cryptography include popular blockchain-based digital assets such as Bitcoin and Ethereum, as well as password-protected web applications. The best known of these protocols is HTTPS, used by 96 percent of Internet websites (Google Report, 2020). Quantum computing is therefore an existential threat to many business sectors that rely on asymmetric cryptography for their day-to-day operations (ETSI, 2020).

While the ability to use longer keys renders symmetric encryption and hashing quantum safe today, they are not immune to further advances in quantum computing. As the field becomes more widely researched and understood, new schemes and algorithms emerge continuously. Shor’s algorithm, for instance, has been improved several times since its inception, mainly to reduce its processing requirements. New algorithms and analyses significantly lessen the quantum hardware capability needed to solve problems beyond the realm of classical supercomputers (Cade, 2020). It is therefore reasonable to assume that, as research progresses, new algorithms will be discovered that target today’s advanced symmetric cryptography and cryptographic hash functions and render them obsolete, as in the case of public-key cryptography.

Achieving a quantum-safe environment will require a different mindset from governments, firms, and individuals. More than 50 percent of organizations, including government agencies, admit to running outdated software. 14 Past experiences with replacing the data encryption standard (DES) and various hash functions (SHA-1, MD5) suggest that it takes at least a decade to replace a widely deployed cryptographic algorithm (National Academies of Sciences, 2019). Migration to quantum-resistant algorithms is likely to be much more complex than previous transitions, given the ubiquitous use of public keys. Therefore, even if all product providers made their software quantum-resistant, public and private organizations alike would need a different approach to obsolescence management. This would be even more complicated and expensive for legacy systems that no longer receive software updates from their manufacturers.

  • V. The Way Forward

We are on the threshold of the quantum computing age. Quantum computers can speed up the process of scientific discovery, from designing new materials for more efficient batteries to creating better drugs and vaccines. Quantum computers could also transform the financial system, as they would solve many problems considerably faster and more accurately than the most powerful classical supercomputers. Leveraging quantum computers’ potential will also require new approaches and algorithms. This includes developing new error-correction schemes, creating new programming languages, forming communities of potential users, and developing common standards to ensure interoperability between different quantum computing approaches and communications.

Quantum computers may also cause substantial disruptions, including undermining financial stability. An important risk of quantum computing is that existing encryption algorithms could become obsolete, especially the widely used public-key algorithms. Cryptoanalysis history is full of cautionary tales about perceived unbreakable cryptography made obsolete by new technologies (Annex II). The race has already started to develop new quantum-safe encryption standards and algorithms. For example, in the U.S., the National Institute of Standards and Technology (NIST) is running a competition for a quantum-safe encryption algorithm, aiming to announce a winner by 2024 (NIST, 2020). If fully functional quantum computers become a reality before or shortly after that, organizations (firms and governments) would have a narrow window to mitigate this risk. In Europe, the European Telecommunications Standards Institute (ETSI) is spearheading the deployment of quantum-safe standards (ETSI, 2015, 2017, 2020). These works feed into the activities of other standard-setting bodies such as the International Telecommunication Union (ITU) and the Internet Engineering Task Force (IETF).

While waiting for quantum-resistant standards, financial system regulators can play an important role by raising the financial community’s awareness of current and forthcoming risks and challenges. First, financial institutions should develop plans to migrate current cryptography to quantum-resistant algorithms. ETSI (2020) has outlined a framework of actions that an organization should take to enable migration to a quantum-safe cryptographic state. The framework comprises three stages: (i) inventory compilation, (ii) preparation of the migration plan, and (iii) migration execution:

Inventory compilation. An organization cannot plan migration without prior knowledge of the assets that quantum computing would affect. Thus, the first stage of migration is to identify the set of cryptographic assets (both hardware and software) and processes in the system. The framework requires managing the business process, allocating a budget and ensuring accountability. The costs could be significant, including financial, temporal, organizational and technical ones. (A minimal illustration of such an inventory follows the three stages.)

Preparation of the migration plan. The migration plan determines whether an asset identified in stage 1 will be migrated or retired, as some assets may become obsolete through redesign. Sequencing the migration is important given the interdependency of assets. If backwards compatibility is required during the migration, the application will have to support both classical and quantum-safe algorithms. This may be achieved by using separate classical and quantum-safe algorithms, or by using hybrid algorithms, depending on the existing cryptographic agility. For example, in November 2020, IBM announced plans to add quantum-safe cryptography to its cloud services, on top of the current standards. 15 Provisions for cryptographic agility should be considered for any new or updated cryptography. If a vulnerability is found in a quantum-safe algorithm, it may be necessary to switch to a different one, although sometimes the vulnerability may be addressed by patches and updates. Ensuring cryptographic agility will make these upgrades easier.

Migration execution. The role of this stage is to implement the migration plan from stage 2 against the inventory from stage 1. This stage also includes mitigation management. A key element of mitigation management is conducting exercises to simulate and test the migration plan and determine its viability. These exercises are important, as they can uncover missing inventory elements (it is probable that the inventory will be incomplete).
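To make stage 1 tangible, the minimal sketch below shows one way a cryptographic inventory entry might be recorded and flagged for migration; the fields and the example systems are hypothetical and are not part of the ETSI framework itself.

```python
from dataclasses import dataclass

@dataclass
class CryptoAsset:
    system: str              # hypothetical system name
    algorithm: str
    purpose: str
    quantum_vulnerable: bool

# Illustrative inventory entries; a real inventory would also cover hardware,
# partners, and third-party suppliers.
inventory = [
    CryptoAsset("online-banking-gateway", "RSA-2048", "TLS key exchange", True),
    CryptoAsset("internal-file-archive", "AES-256", "data at rest", False),
    CryptoAsset("payments-ledger", "ECDSA P-256", "digital signatures", True),
]

# Stage 2 would then sequence the quantum-vulnerable assets for migration.
for asset in (a for a in inventory if a.quantum_vulnerable):
    print(f"migrate: {asset.system} ({asset.algorithm}, {asset.purpose})")
```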

This framework assumes an orderly, planned migration. However, the sudden availability of a viable quantum computer used to attack public keys could require an immediate transition to quantum-safe cryptography. In this case, an emergency migration could require quick, simultaneous execution of the key measures outlined above.

Given the pace of innovation and uncertainty about when quantum-safe standards will become available, financial institutions should build cryptographic agility. This is a property that permits smoothly changing or upgrading cryptographic algorithms or parameters to improve overall cybersecurity resilience in the future. Over the longer term, there may be a need to implement quantum cryptographic methods to reduce cybersecurity risks.

Beyond financial stability, quantum computing raises important privacy risks, and regulators should work with industry experts to understand them. Regulations such as the United States’ Gramm-Leach-Bliley Act (Gramm-Leach, 1999) or the European Union’s General Data Protection Regulation (GDPR, 2018) already guide the protection of information, but may require further scrutiny to ensure quantum-resistant encryption of data exchange and storage. Importantly, given that quantum computers present retroactive risks, the time for action is now.

The IMF has an important role to play in raising the awareness of its members about financial stability risks from quantum computers and promoting quantum-safe standards and practices. At the multilateral level, the IMF should encourage member countries to collaborate closely in developing common standards and protocols to ensure interoperability. At the bilateral level, it should encourage country authorities to develop encryption migration plans for the financial sector as part of surveillance, for example, in the dialogue on ensuring operational resilience of financial institutions, markets, and infrastructure.

  • Annex I. Glossary of Technical Terms Used in the Paper

Cryptanalysis studies the encrypted secret message ( ciphertext ) to gain as much information as possible about the original message.

Cryptography is the science of transmitting secret information using public channels. A cryptologic system performs transformations on a message, the plaintext , and uses a key to render it unintelligible, producing a new version of the message, the ciphertext . To reverse the process, the system performs inverse transformations to recover the plaintext, decrypting the ciphertext ( Dooley, 2018 ).

Cryptographic agility (or crypto agility ) is the property that permits changing or upgrading cryptographic algorithms or parameters. While not specific to quantum computing, crypto agility would make defense against quantum computers easier by allowing substitution of today’s quantum-vulnerable public-key algorithms with quantum-resistant algorithms.

HTTPS (Hypertext Transfer Protocol Secure) is a Web communication protocol used between network devices for secure communication. It encrypts both the information a user sends to a website, and the information that the website sends back—for example, credit card information, bank statements, and e-mail.

Quantum annealing is a process for finding the global minimum of a given objective function over a given set of candidate solutions (candidate states), by a process using quantum fluctuations. It finds an absolute minimum size/length/cost/distance from within a possibly very large, but nonetheless finite set of possible solutions using quantum fluctuation-based computation instead of classical computation.

Quantum computing is the use of a non-classical model of computation. Whereas traditional models of computing such as the Turing machine or Lambda calculus rely on classical representations of computational memory, a quantum computation could transform the memory into a quantum superposition of possible classical states. A quantum computer is a device that could perform such computation.

Quantum entanglement is a label for the observed physical phenomenon that occurs when a pair or group of particles is generated, interact, or share spatial proximity in a way such that the quantum state of each particle of the pair or group cannot be described independently of the state of the others, even when the particles are separated by a large distance.

A quantum gate is a basic quantum circuit operating on a small number of qubits. Quantum gates are the building blocks of quantum circuits, like classical logic gates are for conventional digital circuits.

Quantum key distribution (QKD) is a secure communication method that implements a cryptographic protocol involving components of quantum mechanics. It enables two parties to produce a shared random secret key known only to them, which can then be used to encrypt and decrypt messages.

Quantum mechanics (also known as quantum physics , quantum theory, the wave mechanical model, or matrix mechanics) is a fundamental theory in physics which describes nature at the smallest scales, including atomic and subatomic.

Quantum superposition is a fundamental principle of quantum mechanics, where a system is in more than one state at a time. It states that, much like waves in classical physics, any two (or more) quantum states can be added together (“superposed”) and the result will be another valid quantum state; and conversely, that every quantum state can be represented as a sum of two or more other distinct states.

Quantum “supremacy” is demonstrating that a programmable quantum device can solve a problem that classical computers practically cannot (irrespective of the usefulness of the problem). By comparison, the weaker quantum advantage is demonstrating that a quantum device can solve a problem faster than classical computers. Using the term “supremacy” has been controversial, and quantum advantage is now often used for both descriptions. 16

Qubit or quantum bit is the basic unit of quantum information. It is the quantum version of the classical binary bit. A qubit is a two-state (or two-level) quantum-mechanical system, one of the simplest quantum systems displaying the peculiarity of quantum mechanics. It allows the qubit to be in a coherent superposition of both states/levels simultaneously, a property which is fundamental to quantum mechanics and quantum computing.

A symmetric key is an approach in cryptography where the same key is used to both encrypt and decrypt a message. Asymmetric cryptography uses a pair of related keys: one is used to encrypt a payload and the other to decrypt it. In public-key cryptography , users publish one of the keys, the public key , and keep the other secret, the private key . The public key is then used to encrypt the message and the private key is needed to decrypt it.

  • Annex II. A Brief History of Encryption, Cryptoanalysis and Digital Computers

  • Encryption and Cryptoanalysis

Since ancient times, cryptography has been a race between those trying to keep secrets and adversaries trying to uncover them. The earliest examples of transposition ciphers go back to at least 485 B.C., when Greek soldiers would wrap a strip of papyrus around a staff, a scytale, write a message down its length, and send off the papyrus. The receivers could unscramble messages by wrapping them around another scytale of the same thickness. In this case, the staff’s shape represented the encryption key. The first known historical record of a substitution cipher is from the Roman Empire: Emperor Julius Caesar is believed to have sent encrypted messages to the orator Cicero, replacing each letter by the one three places down the alphabet. The Caesar cipher was broken as early as the 9th century by Arab cryptographers, who documented the techniques of cryptoanalysis, the science of undoing ciphers (Singh, 1999). In “A Manuscript on Deciphering Cryptographic Messages,” the philosopher al-Kindi observed that every language has a characteristic frequency of letters and sequences, and that by capturing them using sample texts of that language, the cryptanalyst might decipher any message.

Simple substitutions became obsolete in the 1700s because of the proliferation of Black Chambers—offices kept by European nations for breaking ciphers and gathering intelligence. As Black Chambers industrialized cryptoanalysis, cryptographers were forced to adopt more elaborate substitutions by turning to polyalphabetic methods: instead of referring to a single alphabet for encryption, cryptographers would switch between two or more alphabets when choosing replacement symbols. The Vigenère cipher, believed to be the first polyalphabetic method and also called Le Chiffre Indéchiffrable, was first described in 1553 and remained popular until it was broken in the 19th century.

World War I intensified the need for secrecy. The radio had brought new capabilities to the field, such as the coordination of troops over long distances, but open waves also allowed enemies to listen to communications. Each nation used its own encryption methods. Some, like the Playfair cipher used by the British, remained unbroken during the war; others, like the German ADFGVX, were broken. In the period following World War I, machines became the logical solution to the increase in the volume of material to decrypt. Several mechanical cryptographic devices were invented in the period preceding World War II, such as the M-94 cipher device used by the US military, the C-36 by the French Army, and the Enigma by the German Army (Dooley, 2018). Several devices were also invented to break their encryption. To break Enigma, Alan Turing—one of the inventors of the digital computer—created the Bombes at Britain’s codebreaking center at Bletchley Park. Colossus, the first programmable electronic computer, enabled the British to break the Lorenz cipher, which protected communications from the German high command. The US Navy built fully automatic analog machines to break the cipher from Japan’s Purple device.

After World War II, digital computers dominated cryptography. Whereas mechanical devices are subject to physical limitations, computers operate at a much higher speed and scramble numbers, not letters, giving access to a large set of new operations. At the beginning of the 1960s, the transistor replaced the vacuum tube in digital circuits for computers and, at the end of that decade, the Internet was invented, kick-starting the current digital age. By the early 1970s, computers became available to business customers, who demanded secrecy capabilities from vendors. As regular citizens became computer users, cryptography became necessary, for instance, to enable credit card transactions or the transmission of personal information through public networks. A plethora of new cryptographic schemes appeared, leading the American National Bureau of Standards to intervene in 1973 and open a public competition to choose a cryptographic standard for the United States. IBM’s Lucifer cipher, renamed the Data Encryption Standard (DES), was adopted as America’s official standard in 1977. After DES was broken in a public competition in 1997, it was replaced as the standard by Triple-DES in 1999, and retired when NIST adopted the Advanced Encryption Standard (AES) in the early 2000s.

Until the mid-1970s, all cryptographic methods used symmetric keys: the same key must be used to encrypt and decrypt a message. Thus, to use cryptography, senders and receivers had to share keys in advance, a complicated matter of logistics. Whitfield Diffie, Martin Hellman, and Ralph Merkle solved the problem in 1976. The Diffie-Hellman key exchange allowed two parties to agree on a secret key using a public channel. The trio effectively created asymmetric cryptography, whereby operations are associated with a pair of related keys: when one is used to encrypt a payload, the other decrypts it, and vice versa. Two years later, Rivest, Shamir and Adleman extended the concept with public-key cryptography, whereby users publish one of the keys, the public key, and keep the other secret, the private key. Asymmetric methods enabled new applications. For instance, people may claim their identity by showing a plaintext message and the cipher produced by their private key, which can be verified by decrypting the cipher with their public key. Asymmetric cryptography (including RSA), also known as public-key cryptography, is widely used over the Internet, including by the financial system, for key exchanges, digital signatures, non-repudiation and authentication. Public and private keys also underpin digital currencies and blockchain technologies.
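The arithmetic behind the Diffie-Hellman exchange fits in a few lines. The toy sketch below uses a deliberately small prime so the numbers stay readable; real deployments use primes of 2,048 bits or more (or elliptic curves), and the parameter choices here are illustrative only.

```python
import secrets

# Toy Diffie-Hellman key exchange (insecure parameters, for illustration only).
p = 4294967291          # a small prime modulus; real systems use >= 2048-bit primes
g = 5                   # public base

a = secrets.randbelow(p - 2) + 1     # Alice's private exponent
b = secrets.randbelow(p - 2) + 1     # Bob's private exponent

A = pow(g, a, p)        # Alice sends A over the public channel
B = pow(g, b, p)        # Bob sends B over the public channel

# Both sides derive the same shared secret without ever transmitting it.
assert pow(B, a, p) == pow(A, b, p)
print("shared secret:", pow(B, a, p))
```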

Asymmetric or public-key cryptography is the most vulnerable to quantum computing. The potential advantages of quantum computers became apparent in the early 1980s, when Richard Feynman pointed out essential difficulties in simulating quantum mechanical systems on classical computers and suggested that building computers based on the principles of quantum mechanics would allow us to avoid those difficulties (Nielsen, 2010). The idea was refined throughout the 1980s. In 1994, Peter Shor published an algorithm that would allow one to perform prime factorization much faster using quantum properties. As prime numbers are at the core of most asymmetric cryptography methods, Shor’s algorithm run on a quantum computer might render most Internet security invalid.
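The key step in Shor's algorithm is reducing factoring to finding the period of modular exponentiation; the quantum computer's only job is the period search, which is exponentially costly classically. The sketch below performs that reduction for a tiny number, using a brute-force classical period search just to show the arithmetic.

```python
from math import gcd

def order(a, n):
    """Smallest r > 0 with a**r % n == 1 (brute force; this is the step
    Shor's quantum period-finding would accelerate)."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

n, a = 15, 7                     # toy semiprime and a base coprime to it
r = order(a, n)                  # r = 4
assert r % 2 == 0
p = gcd(pow(a, r // 2) - 1, n)   # gcd(48, 15) = 3
q = gcd(pow(a, r // 2) + 1, n)   # gcd(50, 15) = 5
print(f"{n} = {p} x {q}")
```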

While quantum computing poses a threat to Internet security, quantum mechanics can also provide unbreakable cryptography. In the 1980s, researchers from IBM proposed a novel way to leverage photon polarization to perform key distribution. By using the laws of physics, Quantum Key Distribution (QKD) can become impenetrable, because eavesdroppers cannot intercept communications without interfering with them. Such experimental systems have been implemented since the 1990s, but they remain far from widespread commercial use.

  • Digital Computers

The origin of classical computers may be traced to 17th-century France. In the small town of Clermont-Ferrand, Blaise Pascal built the first machine that enabled humanity to manipulate numbers by mechanically performing the four basic arithmetic operations. Human ability to do math was enhanced again in 1822 by the English polymath Charles Babbage’s Difference Engine. It could tabulate polynomial functions, which enabled the mechanical approximation of complex calculations such as logarithmic or trigonometric functions. Babbage also designed a general-purpose computer, the Analytical Engine, but the project was terminated due to engineering and funding issues, and a working engine was never built in Babbage’s lifetime. The next notable machines in history were differential analyzers, analog computers that use wheel-and-disc mechanisms to integrate differential equations. The first differential analyzer, built at MIT by Vannevar Bush in 1931, played a particularly important role in history by inspiring one of Bush’s graduate students, Claude Shannon. In 1938, Shannon invented digital circuits for his master’s thesis (Shannon, 1938), proving that complex mathematical operations may be performed by running electricity through specific configurations of electronic components.

Shannon’s work was complemented by Alan Turing’s 1936 paper on computable numbers. It came as an answer to David Hilbert’s Entscheidungsproblem, or decision problem, posed in the previous decade: is there an algorithm that can determine whether any given statement is provable within a formal system? The Turing Machine was an imaginary device composed of a mechanism that moves an infinite tape back and forth, writes symbols to it, and reads recorded symbols. The Church-Turing thesis then states that this device can compute any function on natural numbers as long as there is an effective method of obtaining its value, and, conversely, that such a method exists only if the device can compute that function.

Thus, engineering met mathematics: by the time Claude Shannon invented digital circuits, Turing had just designed the mathematical blueprint of a general-purpose computer. The resulting circuitry, Turing-complete digital computers, was capable of computing every function the imaginary machine can compute. While the Colossus, a war secret built by British intelligence to break Hitler’s communications, was the first in history, modern computers are based on the architecture designed by a team led by John von Neumann, first used in 1949’s EDVAC (Electronic Discrete Variable Automatic Computer). Contemporary digital devices are Turing-complete devices generally composed of processing units (e.g., CPU), storage devices (e.g., RAM/ROM and disk drives), and input and output mechanisms (e.g., keyboard and video). Desktop computers and smartphones follow this same design.

Once the design was invented, engineering advanced enormously in speeding up each of its components. For instance, vacuum tubes were prominent components of CPUs in early machines, needed for their singular capacity to control the direction of the flow of electrons through their terminals. However, tubes presented several challenges related to durability and reliability. They were replaced by transistors, invented in the 1940s, which in turn were replaced by integrated circuits throughout the 1960s. Since then, the performance and size of digital computers have been dictated by the technology for fabricating integrated circuits. Since the 1960s, such technologies have allowed the number of components on a single integrated circuit to double roughly every 18 months, as foreseen by Intel’s Gordon Moore in 1965—the so-called Moore’s law. Such advances, for instance, are the reason we were able to cram all the computing power used in the Apollo 11 lunar landing capsule in 1969 into a single device by the early 2010s. Similar leaps occurred for other components, spawning things like paper-thin foldable displays and pinhead-sized devices that can store entire encyclopedias.

However, since such machines are Turing machines at their core, they are also bound by the Turing machine's limitations. One such limitation is their inability to solve certain mathematical problems efficiently, the so-called NP-hard problems. The most infamous of them is the traveling sales agent problem: calculating the shortest route through a series of cities, visiting each exactly once. Digital computers can calculate solutions for small setups, roughly by comparing all possible paths to each other. As problem size grows, this becomes infeasible; mathematicians have invented heuristic algorithms for finding reasonable solutions without going through all possibilities, but there is no certainty that the optimal path will be found.
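The brute-force approach can be sketched in a few lines of Python. The city coordinates below are made up for illustration; the point is that the number of candidate routes grows factorially with the number of cities, so exhaustive search only works for a handful of them.

    # Brute-force traveling sales agent solver (illustrative sketch).
    # The coordinates are made up; the number of candidate routes grows
    # factorially, so this only works for very small instances.
    from itertools import permutations
    from math import dist

    cities = {"A": (0, 0), "B": (1, 5), "C": (5, 2), "D": (6, 6), "E": (8, 3)}

    def tour_length(route):
        # Length of the closed tour that returns to the starting city.
        return sum(dist(cities[a], cities[b]) for a, b in zip(route, route[1:] + route[:1]))

    start = "A"
    rest = [c for c in cities if c != start]
    best = min(([start] + list(p) for p in permutations(rest)), key=tour_length)
    print(best, round(tour_length(best), 2))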

Because every problem in NP can be reduced to the traveling sales agent problem, unlocking its solution would set in motion a whole new universe of possibilities for many kinds of optimization. This is the key held by quantum computers.

  • Annex III. Modern Cryptographic Algorithms and Their Vulnerabilities to Current Technologies

Today's cryptography is based on three main types of algorithms: symmetric-key algorithms, asymmetric (public-key) algorithms, and algorithmic hash functions, or hashing. Annex IV lists the main current and past algorithms.

The AES algorithm is currently the accepted standard for symmetric-key encryption. NIST selected it in 2001 to replace the former standard (Triple-DES). Although multiple publications have introduced new cryptanalysis schemes attempting to undermine AES, the cryptographic community has shown them to be ineffective in practice. For example, Biryukov and others (2010) outlined an attack against specific variations of AES that reduces the encryption strength. However, such attacks were deemed impractical and dismissed as a non-threat to AES encryption.
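As an illustration of symmetric-key encryption in practice, the sketch below uses AES-256 in GCM mode via the third-party Python cryptography package. It is a minimal example rather than a recommendation; key management, nonce handling, and error handling are deliberately simplified.

    # Symmetric-key encryption sketch: AES-256 in GCM mode via the third-party
    # Python 'cryptography' package (pip install cryptography). Minimal example
    # only; key management and nonce handling are deliberately simplified.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # 256-bit symmetric key
    aes = AESGCM(key)
    nonce = os.urandom(12)                      # a GCM nonce must never repeat for a given key

    ciphertext = aes.encrypt(nonce, b"wire transfer: 100", b"header")
    assert aes.decrypt(nonce, ciphertext, b"header") == b"wire transfer: 100"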

The RSA algorithm, a popular standard for asymmetric (public-key) encryption, is widely used to protect confidentiality and to provide digital signatures. The RSA algorithm has been resilient to cryptanalysis techniques since its publication in 1977, despite several attempts to challenge its strength. It was suggested earlier that some knowledge of the plaintext message, under specific conditions, could weaken the encryption (Durfee, 2002). However, RSA continues to be resilient. Although some schemes may reduce the time and memory required to break public-key encryption, so far it has been shown that adequate key sizes and best practices make public-key cryptography resilient to classical computer attacks. It would take billions of years for a digital computer to break the current standard RSA 2,048-bit key (CISA, 2019).
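The mechanics of RSA can be illustrated with textbook-sized numbers. The sketch below uses tiny primes and no padding, so it is insecure by construction; it only shows how encryption and decryption reduce to modular exponentiation, and why recovering the private key amounts to factoring the modulus.

    # Textbook RSA with tiny primes (insecure; for illustration only). Real
    # deployments use 2,048-bit or larger keys together with padding schemes.
    p, q = 61, 53
    n = p * q                        # public modulus
    phi = (p - 1) * (q - 1)
    e = 17                           # public exponent, coprime with phi
    d = pow(e, -1, phi)              # private exponent (modular inverse, Python 3.8+)

    message = 42
    ciphertext = pow(message, e, n)             # encrypt with the public key (e, n)
    assert pow(ciphertext, d, n) == message     # decrypt with the private key (d, n)
    # Recovering d classically requires factoring n back into p and q, which is
    # what makes adequately sized keys safe from conventional computers.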

Algorithmic hash functions were temporarily impacted by cryptanalysis, but recent progress restored their effectiveness. In 2005, the mathematician Arjen Lenstra and co-authors demonstrated a hash-collision attack 17 against one of the most widely used hash functions, MD5 (Lenstra et al., 2005). Other researchers later demonstrated that a decent desktop computer equipped with a cheap graphics processor (GPU) could find a hash collision in less than a minute. The MD5 algorithm was officially retired by NIST in 2011. However, it is still widely used despite its known weaknesses, demonstrating the long-lasting difficulty of replacing legacy systems. NIST ran a competition to create the next standard algorithmic hash function, named SHA-3, to overcome the cryptanalytic advances undermining MD5 and earlier versions of the SHA algorithms. While some possible weaknesses have been reported, 18 SHA-3 was standardized in 2015 and became the approved standard (Morawiecki et al., 2014). Furthermore, almost any cryptographic algorithm can be strengthened by increasing its key size, but that requires more processing power and thus increases the cost of running the algorithm, sometimes prohibitively.
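For comparison, the snippet below computes an MD5 and a SHA3-256 digest of the same message using Python's standard hashlib module; the message itself is an arbitrary example.

    # Hash digests of the same (arbitrary) message using Python's standard
    # hashlib. MD5 is shown only as an example of a retired algorithm;
    # SHA3-256 is one of the currently approved SHA-3 variants.
    import hashlib

    message = b"settlement instruction #1"
    print("MD5:     ", hashlib.md5(message).hexdigest())        # collisions are cheap to find
    print("SHA3-256:", hashlib.sha3_256(message).hexdigest())   # no practical collisions known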

Beyond the encryption algorithm itself, a different class of attacks targets the surrounding systems. Side-channel attacks target the software, firmware, and hardware used to implement the encryption algorithm. Software and hardware vulnerabilities are usually easier to find and exploit than the underlying mathematics of the encryption algorithm. Vulnerabilities, or bugs, are typically the result of mistakes made during development. However, some vulnerabilities result from misuse or misconfiguration of cryptographic libraries. The Heartbleed vulnerability (CMU, 2014) was a devastating example of a flaw discovered in OpenSSL, a widely used cryptographic library for securing network communication. Lazar and others (2014) reported that 17 percent of the vulnerabilities in cryptographic libraries published by CVE 19 between 2011 and 2014 were mistakes made during development, while the remaining 83 percent were related to misuse or misconfiguration by the hosting applications.
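A classic, textbook-sized illustration of an implementation-level pitfall is the timing side channel created by comparing secret values with an ordinary equality check. The hypothetical sketch below contrasts it with a constant-time comparison; it illustrates the general idea only and is not an analysis of any particular library.

    # Hypothetical illustration of a timing side channel: an ordinary equality
    # check on secrets can exit at the first mismatching byte, leaking timing
    # information, whereas a constant-time comparison does not.
    import hmac

    stored_tag = bytes.fromhex("a1b2c3d4e5f60718")   # made-up secret value

    def verify_naive(candidate: bytes) -> bool:
        return candidate == stored_tag               # early exit -> timing leak

    def verify_constant_time(candidate: bytes) -> bool:
        return hmac.compare_digest(candidate, stored_tag)

    print(verify_constant_time(bytes.fromhex("a1b2c3d4e5f60718")))   # True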

  • Annex IV. Main Cryptographic Algorithms

Allen , Bryce D. 2008 . “ Implementing several attacks on plain ElGamal encryption .”— Graduate Theses and Dissertations, 11535 . Mimeo available at https://lib.dr.iastate.edu/etd/11535 .


Anschuetz , E. , Olson , J. , Aspuru-Guzik , A. and Cao , Y. 2019 . “ Variational quantum factoring ”. In International Workshop on Quantum Technology and Optimization Problems (pp. 74 – 85 ). Springer , Cham .

Arute , F. , Arya , K. , Babbush , R. et al. , 2019 . “ Quantum supremacy using a programmable superconducting processor .”— Nature 574 , 505 – 510 . https://www.nature.com/articles/s41586-019-1666-5#citeas .

Bertoni Guido , Joan Daemen , Michaël Peeters and Gilles Van Assche . 2007 . “ Sponge Functions.”— ECRYPT Hash Workshop 2007 , https://www.researchgate.net/publication/242285874_Sponge_Functions .

Biryukov, Alex, Orr Dunkelman, Nathan Keller, Dmitry Khovratovich, and Adi Shamir. 2010. "Key Recovery Attacks of Practical Complexity on AES-256 Variants with up to 10 Rounds."—Advances in Cryptology – EUROCRYPT 2010, pp. 299–319. https://link.springer.com/chapter/10.1007/978-3-642-13190-5_15.

Boaron, Alberto, Gianluca Boso, Davide Rusca, Cédric Vulliez, Claire Autebert, Misael Caloz, Matthieu Perrenoud, Gaétan Gras, Félix Bussières, Ming-Jun Li, Daniel Nolan, Anthony Martin, and Hugo Zbinden. 2018. "Secure quantum key distribution over 421 km of optical fiber."—July 9, 2018. Mimeo available at https://arxiv.org/pdf/1807.03222.pdf.

Bouland , Adam , Wim van Dam , Hamed Joorati , Iordanis Kerenidis , Anupam Prakash . 2020 . “ Prospects and Challenges of Quantum Finance ”: https://arxiv.org/pdf/2011.06492.pdf

Burr, William, and Kathy Lyons-Burke. 1999. "Public Key Infrastructures for the Financial Services Industry." Mimeo. National Institute of Standards and Technology.

Cade , Chris , Lana Mineh , Ashley Montanaro , and Stasja Stanisic . 2020 . Strategies for solving the Fermi-Hubbard model on near-term quantum computers . Physical Review B .

CISA . 2019 . “ Understanding Encryption.” —CISA, August 2019 . Mimeo available at https://www.nd.gov/itd/sites/itd/files/legacy/alliances/siec/CISA%20Encryption%2028AUG19.pdf

CMU . 2014 : “ OpenSSL TLS heartbeat extension read overflow discloses sensitive information .”—by CERT Coordination Center . Mimeo available at https://www.kb.cert.org/vuls/id/720951/ .

Diffie Whitfield and Martin Hellman . 1976 . “ New Directions in Cryptography .”— IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. IT-22, NO. 6, NOVEMBER 1976 . https://ee.stanford.edu/~hellman/publications/24.pdf .

De Feo Luca , David Jao , and Jerome Plut . 2011 . “ Towards Quantum-Resistant Cryptosystems from Supersingular Elliptic Curve Isogenies .”— Mimeo available at https://eprint.iacr.org/2011/506.pdf .

Dobraunig Christoph , Maria Eichlseder , and Florian Mendel . 2016 . “ Analysis of SHA-512/224 and SHA-512/256 .”— Advances in Cryptology—ASIACRYPT 2015 , pp 612 – 630 , https://link.springer.com/chapter/10.1007%2F978-3-662-48800-3_25 .

Dooley , J.F. 2018 . “ History of Cryptography and Cryptanalysis. Codes, Ciphers, and Their Algorithms ,”— Springer .

Durfee, Glenn. 2002. "Cryptanalysis of RSA Using Algebraic and Lattice Methods."—Stanford University. Mimeo available at http://theory.stanford.edu/~gdurf/durfee-thesis-phd.pdf.

EFF . 1998 . “ Cracking DES: Secrets of Encryption Research, Wiretap Politics, and Chip Design .”— The Electronic Frontier Foundation (EFF), distributed by O’Reilly & Associates, inc . https://archive.org/details/crackingdes00elec .

Egger D. J. et al . 2020 . “ Quantum Computing for Finance: State-of-the-Art and Future Prospects ,” in IEEE Transactions on Quantum Engineering , Vol. 1 , pp. 1 – 24 , 2020 , Art no. 3101724, doi: 10.1109/TQE.2020.3030314.

Macmillan . 1971 . “ The Born-Einstein Letters: Correspondence between Albert Einstein and Max and Hedwig Born from 1916–1955, with commentaries by Max Born .” — Macmillan , 1971.

El Gamal Taher . 1985 . “ A Public Key Cryptosystem and a Signature Scheme Based on Discrete Logarithms .”— IEEE Transactions on Information Theory , Volume: 31 , Issue: 4 , Jul 1985, https://ieeexplore.ieee.org/document/1057074 .

ETSI . 2015 : “ Quantum Safe Cryptography and Security. An introduction, benefits, enablers and challenges .”— European Telecommunications Standards Institute, ETSI White Paper No. 8, June 2015 . Mimeo available at https://www.etsi.org/images/files/ETSIWhitePapers/QuantumSafeWhitepaper.pdf .

ETSI . 2017 . “ Quantum-Safe Cryptography; Quantum-Safe threat assessment .”— European Telecommunications Standards Institute, group report, March 2017 . Mimeo available at https://www.etsi.org/deliver/etsi_gr/QSC/001_099/004/01.01.01_60/gr_QSC004v010101p.pdf .

ETSI . 2020 . “ CYBER; Migration strategies and recommendations to Quantum Safe schemes ”. Available at: https://www.etsi.org/deliver/etsi_tr/103600_103699/103619/01.01.01_60/tr_103619v010101_p.pdf

Ferguson , Niels . 1999 . “ Impossible differentials in Twofish .”— Twofish Technical Report #5, October 19, 1999 . Mimeo available at https://www.schneier.com/academic/paperfiles/paper-twofish-impossible.pdf .

Galbraith, Steven D., Christophe Petit, Barak Shani, and Yan Bo Ti. 2016. "On the Security of Supersingular Isogeny Cryptosystems."—Advances in Cryptology – ASIACRYPT 2016, pp. 63–91, https://link.springer.com/chapter/10.1007%2F978-3-662-53887-6_3.

Google Report . 2020 . “ HTTPS encryption on the web .” — Google Transparency Report . Mimeo available at https://transparencyreport.google.com/https/overview?hl=en .

Gramm-Leach-Bliley Act . 1999 . Financial Services Modernization Act of 1999 , https://www.ftc.gov/tips-advice/business-center/privacy-and-security/gramm-leach-bliley-act .

GDPR . General Data Protection Regulation , 2018 . https://gdpr-info.eu/ .

Gidney, Craig, and Martin Ekerå. 2019. "How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits."—December 6, 2019. Mimeo available at https://arxiv.org/pdf/1905.09749.pdf.

Grassl Markus , Brandon Langenberg , Martin Roetteler , and Rainer Steinwandt . 2015 . “ Applying Grover’s algorithm to AES: quantum resource estimates .” Mimeo available at https://arxiv.org/pdf/1512.04965.pdf .

Heninger , Nadia . 2015 . “ How Diffie-Hellman Fails in Practice.”—Presentation available at https://simons.berkeley.edu/talks/nadia-heninger-2015-07-07 .

Kothari , Robin . 2020 . “ Quantum speedups for unstructured problems: Solving two twenty-year-old problems ”. Microsoft Research Blog : https://www.microsoft.com/en-us/research/blog/quantum-speedups-for-unstructured-problems-solving-two-twenty-year-old-problems/

Johnson, Don, Alfred Menezes, and Scott Vanstone. 2001. "The Elliptic Curve Digital Signature Algorithm (ECDSA)."—Mimeo available at https://www.cs.miami.edu/home/burt/learning/Csc609.142/ecdsa-cert.pdf.

Lazar David , Haogang Chen , Xi Wang , and Nickolai Zeldovich . 2014 . “ Why does cryptographic software fail? A case study and open problems .”— MIT CSAIL . Mimeo available at https://people.csail.mit.edu/nickolai/papers/lazar-cryptobugs.pdf .

Lenstra Arjen , Xiaoyun Wang and Benne de Weger . 2005 . “ Cryptology ePrint Archive: Report 2005/067 .”— Mimeo available at https://eprint.iacr.org/2005/067 .

Martinis John , and Sergio Boixo . 2019 . “ Quantum Supremacy Using a Programmable Superconducting Processor .” Google AI Blog : https://ai.googleblog.com/2019/10/quantum-supremacy-using-programmable.html .

Morawiecki, Paweł, Josef Pieprzyk, and Marian Srebrny. 2014. "Rotational Cryptanalysis of Round-reduced Keccak."—Conference paper, July 2014. Mimeo available at https://www.researchgate.net/publication/267247045_Rotational_Cryptanalysis_of_Round-Reduced_Keccak.

Moriai, Shiho, and Yiqun Lisa Yin. 1999. "Cryptanalysis of Twofish (II)."—Mimeo available at https://www.schneier.com/twofish-analysis-shiho.pdf.

Monz , T. , Nigg , D. , Martinez , E. A. , Brandl , M. F. , Schindler , P. , Rines , R. , Wang , S. X. , Chuang , I.L. and Blatt , R. 2016 : “ Realization of a scalable Shor algorithm ”. Science , 351 ( 6277 ), pp. 1068 – 1070 .

National Academies of Sciences, Engineering, and Medicine. 2019. "Quantum Computing: Progress and Prospects." The National Academies Press, Washington, DC.

Nielsen , M.A. and Chuang , I.L. 2010 . “ Quantum Computation and Quantum Information .”— Cambridge University Press .

NIST . 2019 . “ Transitioning the Use of Cryptographic Algorithms and Key Lengths .”— NIST, March 21, 2019 . Mimeo available at https://csrc.nist.gov/News/2019/NIST-Publishes-SP-800-131A-Rev-2

NIST . 2020 . “ Status Report on the First Round of the NIST Post-Quantum Cryptography Standardization Process .” — National Institute of Standards and Technology Internal Report 8240, July 2020 . Mimeo available at https://csrc.nist.gov/publications/detail/nistir/8309/final

Orus Roman , Samuel Mugel , and Enrique Lizaso . 2019 . “ Quantum computing for finance: overview and prospects .” — Reviews in Physics, Volume 4, November 2019 . Mimeo available at https://doi.org/10.1016/j.revip.2019.100028 .

Pednault , Edwin , John A. Gunnels , Giacomo Nannicini , Lior Horesh , and Robert Wisnieff . 2019 . “ Leveraging Secondary Storage to Simulate Deep 54-qubit Sycamore Circuits .” Mimeo available at https://arxiv.org/abs/1910.09534 .

Shannon, C.E. 1938. "A symbolic analysis of relay and switching circuits." Electrical Engineering, 57 (12), pp. 713–723.

Singh , S . 1999 . “ The Code Book: The Evolution of Secrecy from Mary, Queen of Scots to Quantum Cryptography ,”— Doubleday Books .

Stevens Marc , Elie Bursztein , Pierre Karpman , Ange Albertini , Yarik Markov . 2017 . “ The first collision for full SHA-1 .”— Cryptology ePrint Archive: Report 2017/190 . Mimeo available at https://eprint.iacr.org/2017/190 .

Tang, Lynda, Nayoung Lee, and Sophie Russo. 2018. "Breaking Enigma." Mimeo available at https://www.semanticscholar.org/paper/Breaking-Enigma-Tang-Lee/692ea1d3eee5f423639d36f495bc6c7f7614806c.

Zhong, Han-Sen, Hui Wang, Yu-Hao Deng, Ming-Cheng Chen et al. 2020. "Quantum computational advantage using photons."—Science, December 3, 2020.

We would like to thank, without implications, Andreas Bauer, Sonja Davidovic, Davide Furceri, Dong He, and Herve Tourpe for their helpful comments and suggestions on earlier versions of the paper; and Mariam Souleyman for excellent administrative and editorial assistance.

Macmillan (1971, p. 158).

In the literature on quantum computing, computers that process information according to classical laws of physics are referred to as classical computers, as opposed to quantum computers. In this paper, we use the terms classical, conventional, digital, and traditional computers interchangeably.

These risks are known as “harvest now, decrypt later” attacks.

“ Uncertainty principals: Commercialising quantum computers. ”—The Economist, September 26, 2020.

“ Cramming More Power Into a Quantum Device .”— IBM research blog, March 4, 2019.

While the final objective is to build fully error-corrected quantum computers, an intermediate objective is to build practical commercial applications of noisy intermediate-scale quantum (NISQ) computers. Currently, noise is present in both quantum annealers and NISQ-type machines, limiting the complexity of the problems that they can solve.

"China Reaches New Milestone in Space-Based Quantum Communications."—Scientific American, June 25, 2020.

China has developed the world’s first mobile quantum satellite station , NewScientist, January 10, 2020.

Unhackable internet , MIT Technology Review, April 2, 2020.

Researchers achieve sustained, high-fidelity quantum teleportation , Phys.org, December 29, 2020.

https://www.darpa.mil/program/quantum-assisted-sensing-and-readout .

Cryptanalysis, the analysis of the encrypted secret message (ciphertext) to gain as much information as possible about the original message, studies the algorithms, mathematics, and techniques used to uncover secret messages. By exploiting weaknesses in the underlying encryption methods, much can be learned about the original message without knowing the secret key (see Annex III).

“ Thousands of Organizations Run the Majority of their Computers on Outdated Operating Systems, Nearly Tripling Chances of a Data Breach .” —BitSight.

https://newsroom.ibm.com/2020-11-30-IBM-Cloud-Delivers-Quantum-Safe-Cryptography-and-Hyper-Protect-Crypto-Services-to-Help-Protect-Data-in-the-Hybrid-Era .

Instead of ‘supremacy’ use ‘quantum advantage’ : Nature, December 10, 2019.

In a hash-collision attack, an attacker attempts to find two inputs to the hash algorithm that produce the same hash value. When such a collision is found, the algorithmic hash function is deemed insecure.

They described a preimage attack, based on rotational cryptanalysis, against a round-reduced 512-bit variant of SHA-3. As a result, less time and memory would be required to attack the function.

The Common Vulnerabilities and Exposures (CVE) is an international cybersecurity community effort to maintain a list of common identifiers for publicly known cybersecurity vulnerabilities.

Quantum Positioned

Spooky Action at a Distance: The Spine-Chilling Science of Quantum Entanglement

October 30, 2023


Quantum entanglement is a totally spooky phenomenon in quantum physics. The idea was first dreamed up in 1935 by Einstein, Podolsky and Rosen as a thought experiment meant to show that quantum mechanics wasn't a complete theory yet. But it turned out to be very real – and that was first shown experimentally in 1972, in a test of Bell's inequality.

So what is quantum entanglement? It's when two or more particles get so closely linked together that their individual quantum states are tied up with each other. It's like they become one connected system, even if the particles are far apart in space. So if you measure one entangled particle, the properties of its partner particle snap into correlation instantly, no matter the distance – even though no usable signal passes between them, so nothing actually travels faster than light.

It’s utterly bizarre – the particles act interconnected as if they’re not separate anymore. Experiments have proven this spooky action-at-a-distance is real, and it reveals something deep and non-intuitive about our quantum reality. Quantum entanglement shows that physics at the atomic scale is full of radical phenomena that clash with our everyday assumptions about causality and connections in space and time.

Correlation of Entanglement

So when particles get entangled, they end up having this bizarre, instant connection that totally defies common sense. If you measure one entangled particle, you instantly know something about the other one – even if they’re light years apart! It’s called correlation due to entanglement.

Basically, entangled particles are like synchronized dancers – if one particle spins clockwise, its entangled partner will instantly spin counterclockwise. Spookily, this happens immediately, no matter how far apart they are. Einstein called it “spooky action at a distance” and thought it was too weird to be real – but experiments have proven that it’s a real effect.

This instant correlation only happens with entangled particles – there’s no actual physical force connecting them after they split up. It’s like they share a quantum state rather than being separate individual particles. This entanglement correlation is completely unique to the quantum realm . In our everyday world, two separated objects can’t instantly affect each other without a force acting between them. But at tiny quantum scales, the world plays by different rules. Particles can remain mysteriously connected across space and time in a way that defies explanation.

Non-locality

Non-locality is a concept in quantum physics that refers to the phenomenon where two or more entangled particles exhibit instantaneous correlations in their measurements, regardless of the distance between them. This means that when one particle’s quantum state is measured and a property is determined, the corresponding property of the other entangled particle is immediately known, even if it is far away and the two particles are spatially separated, seemingly violating the speed-of-light limit for information transfer.

Bell’s Theorem

Bell’s Theorem is a huge deal in quantum physics. It was thought up in the 1960s by a physicist named John Bell. He wanted to figure out if the freaky quantum correlations between entangled particles could be explained by old-school classical physics, or if we needed something totally new – a non-classical, quantum explanation.

Basically, Bell came up with this mathematical inequality, now called Bell’s inequality . He showed that if quantum particles like photons operated under classical physics, they would obey this inequality when measured. But if the quantum theory was correct, certain measurements on entangled particles would violate Bell’s inequality.

This was big news because it meant scientists could design experiments to test Bell’s theorem . Over the years, physicists have run all kinds of tests on entangled particles. And you know what? The results have consistently shown that entangled particles violate Bell’s inequality, just as quantum theory predicts.
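Here's a tiny numerical sketch of that violation, using the CHSH form of Bell's inequality (the post above doesn't name a specific form, so treat this as one common choice; the angles a1, a2, b1, b2 are the standard illustrative settings). Quantum mechanics predicts a correlation of -cos(a - b) between analyzer angles for entangled particles in the singlet state, while any local classical model has to keep |S| at or below 2.

    # Sketch of the CHSH form of Bell's inequality (one common form; the post
    # does not name a specific one). Quantum mechanics predicts a correlation
    # of -cos(a - b) between analyzer angles for a pair of entangled particles
    # in the singlet state; any local "classical" model must keep |S| <= 2.
    from math import cos, pi

    def E(a, b):
        return -cos(a - b)            # quantum prediction for the singlet state

    a1, a2 = 0, pi / 2                # two measurement angles on particle 1
    b1, b2 = pi / 4, 3 * pi / 4       # two measurement angles on particle 2

    S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
    print(abs(S))                     # about 2.828 = 2*sqrt(2), beating the classical bound of 2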

So those inexplicable quantum correlations really can't be explained by classical physics – we definitely need quantum mechanics to account for them. This was groundbreaking because it showed that the quantum realm is utterly different from the world we see around us every day. At the quantum level, reality is nonlocal and things are interconnected in mind-bending ways that classical physics just can't grasp.

Philosophically, this shakes the foundations of how we think about reality, causality, and connections in space and time. Practically, it also enables new quantum technologies like quantum cryptography and quantum computing. But at its core, Bell's theorem definitively showed that to make sense of the tiny quantum world, we need an entirely new set of rules – the strange but powerful rules of quantum mechanics.

Future Applications of Quantum Entanglement

Quantum entanglement, a captivating phenomenon in the realm of quantum physics, has transcended its enigmatic nature to find practical applications across a spectrum of scientific and technological domains. Despite its initial paradoxical appearance, entanglement has been harnessed to yield substantial benefits in diverse fields.

One notable application of entanglement is in quantum cryptography, where it underpins the principles of quantum key distribution (QKD). QKD protocols, exemplified by the renowned BB84 protocol and its entanglement-based variants, let two parties establish a shared secret key. Quantum mechanics ensures that any attempt to intercept the key disturbs the quantum states being exchanged, alerting the communicators to potential eavesdropping.
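For a feel of how the key exchange works, here is a toy, noise-free BB84-style simulation in Python. It models only the prepare-and-measure bookkeeping (random bits and bases, then sifting); real QKD runs on optical hardware and adds eavesdropping checks, error correction, and privacy amplification, and entanglement-based protocols differ in the details. The basis labels "X"/"Z" and the length of 20 are arbitrary choices.

    # Toy, noise-free BB84-style key exchange (illustrative sketch only).
    import random

    n = 20
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("XZ") for _ in range(n)]    # preparation bases
    bob_bases   = [random.choice("XZ") for _ in range(n)]    # measurement bases

    # If Bob measures in the basis Alice used, he gets her bit; otherwise his
    # outcome is random (modeled here as a coin flip).
    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # The bases (not the bits) are compared publicly; only matching positions are kept.
    alice_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    bob_key   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases) if ab == bb]
    assert alice_key == bob_key    # with no eavesdropper and no noise, the sifted keys agree
    print(alice_key)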

Entanglement's role extends to the tantalizing concept of quantum teleportation. Leveraging entanglement, quantum teleportation allows the quantum state of one particle to be transferred to another, even if they are separated by significant distances – though a classical message is also required, so the transfer cannot outpace light. This groundbreaking phenomenon holds promise for quantum communication and the prospective development of quantum internet infrastructure.

In essence, quantum entanglement, with its seemingly paradoxical properties, has evolved from a perplexing enigma to a wellspring of practical applications. Its utilization spans quantum cryptography, computing, communication, imaging, sensing, and fundamental research, holding the potential to reshape the landscape of modern science and technology.

Quantum entanglement has been experimentally verified through various tests, including Bell inequality tests, which have consistently shown correlations that cannot be explained by classical physics. While entanglement has been widely studied and demonstrated in laboratories, it remains one of the most intriguing and counterintuitive aspects of quantum mechanics.

Quantum computing is taking on its biggest challenge: noise

For a while researchers thought they’d have to make do with noisy, error-prone systems, at least in the near term. That’s starting to change.

By Michael Brooks

Jay Gambetta at IBM

In the past 20 years, hundreds of companies, including giants like Google, Microsoft, and IBM, have staked a claim in the rush to establish quantum computing. Investors have put in well over $5 billion so far. All this effort has just one purpose: creating the world’s next big thing. 

Quantum computers use the counterintuitive rules that govern matter at the atomic and subatomic level to process information in ways that are impossible with conventional, or “classical,” computers. Experts suspect that this technology will be able to make an impact in fields as disparate as drug discovery, cryptography, finance, and supply-chain logistics.

The promise is certainly there, but so is the hype. In 2022, for instance, Haim Israel, managing director of research at Bank of America, declared that quantum computing will be “bigger than fire and bigger than all the revolutions that humanity has seen.” Even among scientists, a slew of claims and vicious counterclaims have made it a hard field to assess.

Ultimately, though, assessing our progress in building useful quantum computers comes down to one central factor: whether we can handle the noise. The delicate nature of quantum systems makes them extremely vulnerable to the slightest disturbance, whether that’s a stray photon created by heat, a random signal from the surrounding electronics, or a physical vibration. This noise wreaks havoc, generating errors or even stopping a quantum computation in its tracks. It doesn’t matter how big your processor is, or what the killer applications might turn out to be: unless noise can be tamed, a quantum computer will never surpass what a classical computer can do. 

For many years, researchers thought they might just have to make do with noisy circuitry, at least in the near term—and many hunted for applications that might do something useful with that limited capacity. The hunt hasn’t gone particularly well, but that may not matter now. In the last couple of years, theoretical and experimental breakthroughs have enabled researchers to declare that the problem of noise might finally be on the ropes. A combination of hardware and software strategies is showing promise for suppressing, mitigating, and cleaning up quantum errors. It’s not an especially elegant approach, but it does look as if it could work—and sooner than anyone expected.

“I’m seeing much more evidence being presented in defense of optimism,” says Earl Campbell, vice president of quantum science at Riverlane, a quantum computing company based in Cambridge, UK. 

Even the hard-line skeptics are being won over. University of Helsinki professor Sabrina Maniscalco, for example, researches the impact of noise on computations. A decade ago, she says, she was writing quantum computing off. “I thought there were really fundamental issues. I had no certainty that there would be a way out,” she says. Now, though, she is working on using quantum systems to design improved versions of light-activated cancer drugs that are effective at lower concentrations and can be activated by a less harmful form of light. She thinks the project is just two and a half years from success. For Maniscalco, the era of “quantum utility”—the point at which, for certain tasks, it makes sense to use a quantum rather than a classical processor—is almost upon us. “I’m actually quite confident about the fact that we will be entering the quantum utility era very soon,” she says. 

Putting qubits in the cloud

This breakthrough moment comes after more than a decade of creeping disappointment. Throughout the late 2000s and the early 2010s, researchers building and running real-world quantum computers found them to be far more problematic than the theorists had hoped. 

To some people, these problems seemed insurmountable. But others, like Jay Gambetta, were unfazed. 

A quiet-spoken Australian, Gambetta has a PhD in physics from Griffith University, on Australia’s Gold Coast. He chose to go there in part because it allowed him to feed his surfing addiction. But in July 2004, he wrenched himself away and skipped off to the Northern Hemisphere to do research at Yale University on the quantum properties of light. Three years later (by which time he was an ex-surfer thanks to the chilly waters around New Haven), Gambetta moved even further north, to the University of Waterloo in Ontario, Canada. Then he learned that IBM wanted to get a little more hands-on with quantum computing. In 2011, Gambetta became one of the company’s new hires. 

Quantum Chandelier at IBM in Yorktown Heights, NY

IBM's quantum engineers had been busy building quantum versions of the classical computer's binary digit, or bit. In classical computers, the bit is an electronic switch, with two states to represent 0 and 1. In quantum computers, things are less black and white. If isolated from noise, a quantum bit, or "qubit," can exist in a probabilistic combination of those two possible states, a bit like a coin in mid-toss. This property of qubits, along with their potential to be "entangled" with other qubits, is the key to the revolutionary possibilities of quantum computing.
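A rough way to picture this is as a two-entry list of amplitudes. The sketch below is a plain simulation with no connection to IBM's hardware: it samples measurements from an equal superposition, where each individual outcome is random but the statistics over many shots are fixed.

    # Plain simulation of a single qubit as two amplitudes (nothing to do with
    # IBM's hardware). Measurement probabilities are the squared magnitudes of
    # the amplitudes; each shot is random, but the statistics are fixed.
    from math import sqrt
    import random

    state = [1 / sqrt(2), 1 / sqrt(2)]    # equal superposition of 0 and 1

    def measure(state):
        p0 = abs(state[0]) ** 2           # probability of reading 0
        return 0 if random.random() < p0 else 1

    counts = {0: 0, 1: 0}
    for _ in range(10_000):
        counts[measure(state)] += 1
    print(counts)                         # roughly 5,000 each, like a coin in mid-toss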

A year after joining the company, Gambetta spotted a problem with IBM’s qubits: everyone could see that they were getting pretty good. Whenever he met up with his fellow physicists at conferences, they would ask him to test out their latest ideas on IBM’s qubits. Within a couple of years, Gambetta had begun to balk at the volume of requests. “I started thinking that this was insane—why should we just run experiments for physicists?” he recalls. 

It occurred to him that his life might be easier if he could find a way for physicists to operate IBM’s qubits for themselves—maybe via cloud computing. He mentioned it to his boss, and then he found himself with five minutes to pitch the idea to IBM’s executives at a gathering in late 2014. The only question they asked was whether Gambetta was sure he could pull it off. “I said yes,” he says. “I thought, how hard can it be?”

Very hard, it turned out, because IBM’s executives told Gambetta he had to get it done quickly. “I wanted to spend two years doing it,” he says. They gave him a year.

It was a daunting challenge: he barely knew what the cloud was back then. Fortunately, some of his colleagues did, and they were able to upgrade the team’s remote access protocols—useful for tweaking the machine in the evening or on the weekend—to create a suite of interfaces that could be accessed from anywhere in the world. The world’s first cloud-access quantum computer, built using five qubits, went live at midnight on May the 4th, 2016. The date, Star Wars Day, was chosen by nerds, for nerds. “I don’t think anyone in upper management was aware of that,” Gambetta says, laughing.

Not that upper management’s reaction to the launch date was uppermost in his mind. Of far more concern, he says, was whether a system reflecting years of behind-the-scenes development work would survive being hooked up to the real world. “We watched the first jobs come in. We could see them pinging on the quantum computer,” he says. “When it didn’t break, we started to relax.”

Cloud-based quantum computing was an instant hit. Seven thousand people signed up in the first week, and there were 22,000 registered users by the end of the month. Their ventures made it clear, however, that quantum computing had a big problem.

The field’s eventual aim is to have hundreds of thousands, if not millions, of qubits working together. But when it became possible for researchers to test out quantum computers with just a few qubits working together, many theory-based assumptions about how much noise they would generate turned out to be seriously off. 

Some noise was always in the cards. Because they operate at temperatures above absolute zero, where thermal radiation is always present, everyone expected some random knocks to the qubits. But there were nonrandom knocks too. Changing temperatures in the control electronics created noise. Applying pulses of energy to put the qubits in the right states created noise. And worst of all, it turned out that sending a control signal to one qubit created noise in other, nearby qubits. “You’re manipulating a qubit and another one over there feels it,” says Michael Biercuk, director of the Quantum Control Laboratory at the University of Sydney in Australia. 

By the time quantum algorithms were running on a dozen or so qubits, the performance was consistently shocking. In a 2022 assessment, Biercuk and others calculated the probability that an algorithm would run successfully before noise destroyed the information held in the qubits and forced the computation off track. If an algorithm with a known correct answer was run 30,000 times, say, the correct answer might be returned only three times.

Though disappointing, it was also educational. “People learned a lot about these machines by actually using them,” Biercuk says. “We found a lot of stuff that more or less nobody knew about—or they knew and had no idea what to do about it.” 

Fixing the errors

Once they had recovered from this noisy slap, researchers began to rally. And they have now come up with a set of solutions that can work together to bring the noise under control.

Broadly speaking, solutions can be classed into three categories. The base layer is error suppression. This works through classical software and machine-learning algorithms, which continually analyze the behavior of the circuits and the qubits and then reconfigure the circuit design and the way instructions are given so that the information held in the qubits is better protected. This is one of the things that Biercuk’s company, Q-CTRL, works on; suppression, the company says, can make quantum algorithms 1,000 times more likely to produce a correct answer. 

The next layer, error mitigation, uses the fact that not all errors cause a computation to fail; many of them will just steer the computation off track. By looking at the errors that noise creates in a particular system running a particular algorithm, researchers can apply a kind of “anti-noise” to the quantum circuit to reduce the chances of errors during the computation and in the output. This technique, something akin to the operation of noise-­canceling headphones, is not a perfect fix. It relies, for instance, on running the algorithm multiple times, which increases the cost of operation, and the algorithm only estimates the noise. Nonetheless, it does a decent job of reducing errors in the final output, Gambetta says. 
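One concrete technique in this spirit is zero-noise extrapolation; the article describes mitigation only loosely as "anti-noise," so naming this particular method is an assumption. The sketch below fits a line through results measured at deliberately amplified noise levels (the numbers are hypothetical) and reads off the intercept at zero noise.

    # Sketch of zero-noise extrapolation, one concrete mitigation technique
    # (an assumption; the article does not name a specific method). Results at
    # deliberately amplified noise levels are extrapolated back to zero noise.
    noise_scales  = [1.0, 2.0, 3.0]       # 1x, 2x, 3x amplified noise
    noisy_results = [0.82, 0.68, 0.55]    # hypothetical expectation values

    # Fit result = a * scale + b by least squares and read off b, the value at scale 0.
    m = len(noise_scales)
    mean_x = sum(noise_scales) / m
    mean_y = sum(noisy_results) / m
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(noise_scales, noisy_results))
    den = sum((x - mean_x) ** 2 for x in noise_scales)
    a = num / den
    b = mean_y - a * mean_x
    print("zero-noise estimate:", round(b, 3))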

Helsinki-based Algorithmiq, where Maniscalco is CEO, has  its own way of cleaning up noise after the computation is done. “It basically eliminates the noise in post-­processing, like cleaning up the mess from the quantum computer,” Maniscalco says. So far, it seems to work at reasonably large scales. 

On top of all that, there has been a growing roster of achievements in “quantum error correction,” or QEC. Instead of holding a qubit’s worth of information in one qubit, QEC encodes it in the quantum states of a set of qubits. A noise-induced error in any one of those is not as catastrophic as it would be if the information were held by a single qubit: by monitoring each of the additional qubits, it’s possible to detect any change and correct it before the information becomes unusable. 
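The simplest illustration of that redundancy idea is a three-qubit repetition code protecting against bit flips, simulated classically below. Real schemes such as surface codes also handle phase errors and detect faults without reading out the protected information, so this is only a cartoon of the principle; the error probability used is arbitrary.

    # Cartoon of the redundancy idea behind QEC: a three-bit repetition code
    # protecting against bit flips, simulated classically. Real codes (e.g.,
    # surface codes) also handle phase errors and detect faults without
    # reading out the protected information; the 5 percent rate is arbitrary.
    import random

    def encode(bit):
        return [bit, bit, bit]                      # one logical bit spread over three physical bits

    def apply_noise(bits, p):
        return [b ^ (random.random() < p) for b in bits]   # each bit flips with probability p

    def decode(bits):
        return int(sum(bits) >= 2)                  # majority vote corrects any single flip

    p, trials = 0.05, 100_000
    failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
    print("unprotected error rate:", p)
    print("protected error rate:  ", round(failures / trials, 4))   # roughly 3*p**2, about 0.007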

Implementing QEC has long been considered one of the essential steps on the path to large-scale, noise-tolerant quantum computing—to machines that can achieve all the promise of the technology, such as the ability to crack popular encryption schemes. The trouble is, QEC uses a lot of overhead. The gold-standard error correction architecture, known as a surface code, requires at least 13 physical qubits to protect a single useful “logical” qubit. As you connect logical qubits together, that number balloons: a useful processor might require 1,000 physical qubits for every logical qubit.

There are now multiple reasons to be optimistic even about this, however. In July 2022, for instance, Google’s researchers published a demonstration of a surface code in action where performance got better—not worse—when more qubits were connected together. 

There have also been promising demonstrations of theoretical alternatives to surface codes. In August 2023, an IBM team that included Gambetta showed an error correction technique that could control the errors in a 12-qubit memory circuit using an extra 276 qubits, a big improvement over the thousands of extra qubits required by surface codes. 

In September, two other teams demonstrated similar improvements with a fault-tolerant circuit called a CCZ gate, using superconducting circuitry and ion-trap processors .

That so many noise-handling techniques are flourishing is a huge deal—especially at a time when the notion that we might get something useful out of small-scale, noisy processors has turned out to be a bust. 

Actual error correction is not yet happening on commercially available quantum processors (and is not generally implementable as a real-time process during computations). But Biercuk sees quantum computing as finally hitting its stride. “I think we’re well on the way now,” he says. “I don’t see any fundamental issues at all.”

And these innovations are happening alongside general improvements in hardware performance—meaning that there are ever fewer baseline errors in the functioning qubits—and an increase in the number of qubits on each processor, making bigger and more useful calculations possible. Biercuk says he is starting to see places where he might soon choose a quantum computer over the best-­performing classical machines. Neither a classical nor a quantum computer can fully solve large-scale tasks like finding the optimal routes for a nationwide fleet of delivery trucks. But, Biercuk points out, accessing and running the best classical supercomputers costs a great deal of money—potentially more than accessing and running a quantum computer that might even give a slightly better solution.

“Look at what high-performance computing centers are doing on a daily basis,” says Kuan Tan, CTO and cofounder of the Finland-based quantum computer provider IQM. “They’re running power-hungry scientific calculations that are reachable [by] quantum computers that will consume much less power.” A quantum computer doesn’t have to be a better computer than any other kind of machine to attract paying customers, Tan says. It just has to be comparable in performance and cheaper to run. He expects we’ll achieve that quantum energy advantage in the next three to five years. 

Finding utility

A debate has long raged about what target quantum computing researchers should be aiming for in their bid to compete with classical computers. Quantum supremacy, the goal Google has pursued—a demonstration that a quantum computer can solve a problem no classical computer can crack in a reasonable amount of time? Or quantum advantage—superior performance when it comes to a useful problem—as IBM has preferred? Or quantum utility , IBM’s newest buzzword? The semantics reflect differing views of what near-term objectives are important. 

In June, IBM announced that it would begin retiring its entry-level processors from the cloud, so that its 127-qubit Eagle processor would be the smallest one that the company would make available. The move is aimed at pushing researchers to prioritize truly useful tasks. Eagle is a “utility-scale” processor, IBM says—when correctly handled, it can “provide useful results to problems that challenge the best scalable classical methods.” 

It’s a controversial claim—many doubt that Eagle really is capable of outperforming suitably prepared classical machines. But classical computers are already struggling to keep up with it, and IBM has even larger systems: the 433-qubit Osprey processor, which is also cloud-accessible, and the 1,121-qubit Condor processor, which debuted in December . (Gambetta has a simple rationale for the way he names IBM’s quantum processors: “I like birds.”) The company has a new modular design, called Heron, and Flamingo is slated to appear in 2025—with fully quantum connections between chips that allow the quantum information to flow between different processors unhindered, enabling truly large-scale quantum computation. That will make 2025 the first year that quantum computing will be provably scalable, Gambetta says: “I’m aiming for 2025 to be an important year for demonstrating key technologies that allow us to scale to hundreds of thousands of qubits.”

IQM’s Tan is astonished at the pace of development. “It’s mind-boggling how fast this field is progressing,” he says. “When I was working in this field 10 years ago, I would never have expected to have a 10-qubit chip at this point. Now we’re talking about hundreds already, and potentially thousands in the coming years.” 

It’s not just IBM. Campbell has been impressed by Google’s quiet but emphatic progress, for instance. “They operate differently, but they have hit the milestones on their public road map,” he says. “They seem to be doing what they say they will do.” Other household-name companies are embracing quantum computing too. “We’re seeing Intel using their top-line machines, the ones that they use for making chips, to make quantum devices,” Tan says. Intel is following a technology path very different from IBM’s: creating qubits in silicon devices that the company knows how to manufacture at scale, with minimal noise-inducing defects. 

finger points at Heron chip

As quantum computing hits its stride and quantum computers begin to process real-world data, technological and geographical diversity will be important to avoid geopolitical issues and problems with data-sharing regulations. 

There are restrictions, for instance, aimed at maintaining national security—which will perhaps limit the market opportunities of multinational giants such as IBM and Google. At the beginning of 2022, France’s defense minister declared quantum technologies to be of “strategic interest” while announcing a new national program of research. In July 2023, Deutsche Telekom announced a new partnership with IQM for cloud-based access to quantum computing, calling it a way for DT customers to access a “truly sovereign quantum environment, built and managed from within Europe.” 

This is not just nationalistic bluster: sovereignty matters. DT is leading the European Commission’s development of a quantum-­based, EU-wide high-security communications infrastructure; as the era approaches when large-scale quantum computers pose a serious threat to standard encryption protocols , governments and commercial organizations will want to be able to test “post-quantum” encryption algorithms—ones that withstand attack by any quantum computer, irrespective of its size—within their own borders. 

Not that this is a problem yet. Few people think that a security-destroying large-scale quantum processor is just around the corner. But there is certainly a growing belief in the field’s potential to be transformative—and useful—in other ways within just a few years. And these days, that belief is based on real-world achievements. “At Algorithmiq, we believe in a future where quantum utility will happen soon, but I can trace this optimism back to patents and publications,” Maniscalco says. 

The only downside for her is that not everybody has come around the way she has. Quantum computing is here now, she insists—but the old objections die hard, and many people refuse to see it. 

“There is still a lot of misunderstanding: I get very upset when I see or hear certain conversations,” she says. “Sometimes I wish I had a magic wand that could open people’s eyes.” 


  19. Spooky action at a distance (D)

    > Quantum Computer Science > Spooky action at a distance; Quantum Computer Science. An Introduction. Buy print or eBook [Opens in a new window] Book contents. Frontmatter. Contents. ... Spooky action at a distance; N. David Mermin, Cornell University, New York; Book: Quantum Computer Science;

  20. Quantum Entanglement: Spooky Action at a Distance

    spooky action at a distance that bothered Einstein and oth- ... Quantum Computing. Springer, New Y ork, 1998. [6] C. P. Williams. Explorations in Quantum Com-puting. 2nd Edition.

  21. Quantum entanglement

    Quantum entanglement is the phenomenon that occurs when a duet of particles are generated, interact, or share spatial proximity in such a way that the quantum state of each particle of the group cannot be described independently of the state of the others, including when the particles are separated by a large distance. The topic of quantum entanglement is at the heart of the disparity between ...

  22. (PDF) Quantum Computing and the Financial System: Spooky Action at a

    The era of quantum computing is about to begin, with profound implications for the global economy and the financial system. Rapid development of quantum computing brings both benefits and risks.

  23. Quantum Computing

    What happens when we can't link physical cause and effect between two actions? Well, quantum bits (or qubits) do this all the time. Let's look into how quant...

  24. Quantum computing is taking on its biggest challenge: noise

    IBM's quantum engineers had been busy building quantum versions of the classical computer's binary digit, or bit. In classical computers, the bit is an electronic switch, with two states to ...