The Quest for Quantum Gravity: The Stubborn Offspring of Quantum Field Theory and General Relativity

Meet the Parents of Quantum Gravity: Quantum Field Theory and General Relativity

Quantum Field Theory (QFT) and General Relativity (GR) form the theoretical and mathematical foundations for modern physics and cosmology.  QFT is an extension of Quantum Mechanics (QM), accounting for the creation and annihilation of particles.  The primary entities in QFT are fields rather than particles, and it can be expressed in a Lorentz-invariant form, consistent with Einstein’s Special Theory of Relativity.  GR is, of course, Einstein’s brainchild, which explains gravity as the curvature of space and time, induced by matter and energy.  GR enabled Einstein to correctly calculate the magnitude of the precession of Mercury’s perihelion and the deflection of light by the Sun, and almost enabled him to predict the expansion of the cosmos.

These two paradigms, QFT and GR, have enjoyed unprecedented success in their range of validity, precision of experimental verification, and the amazing technologies that they have made possible.  However, many questions remain unanswered.  Puzzles include: what was the physics of the early universe and the pre-universe, what is dark matter, what is dark energy, what is the origin and nature of spacetime, what goes on at the horizon of a black hole and at a black hole singularity, how can gravity be united with the other three forces in a unified theory, what is the role of gravity in quantum decoherence?  Answering these questions may require finding a more general theory that merges QFT and GR into a unified framework encompassing both paradigms, a theory known as Quantum Gravity (QG).

QFT is essentially the theory of the very small, where quantum effects dominate and gravity can be ignored because it is so weak. GR is essentially the theory of the very large or heavy, where gravity dominates and quantum effects disappear.  A theory of QG must be able to predict and explain situations where both quantum effects and strong-field gravity are important.  Quantum Gravity in under five minutes:

The Apparently Incompatible Natures of Quantum Gravity’s Parents

QFT and GR are founded on seemingly different premises for how the universe works.  For example, in QFT, particle fields are embedded in the flat (Minkowski) spacetime of Special Relativity.  In GR, time flows at different rates depending on the spacetime geometry.  And gravity is due to the curvature of spacetime, which changes as gravitational masses move.  The most straightforward ways of combining the two theories by quantizing gravity are non-renormalizable.  This means that calculations run away to infinity and cannot be tamed through a redefinition of certain parameters, as is done in QFT.

This problem is related to the fact that all particles attract each other gravitationally, and energy as well as mass create spacetime curvature.  When quantizing gravity, there are infinitely many independent parameters needed to define the theory.  At low energies, this form of quantum gravity reduces to the usual GR.  But, at high energies (small distance scales), all of the infinitely many unknown parameters are important and predictions become impossible.

A workable theory of quantum gravity must make use of some deep principle that reduces the infinitely many unknown parameters to a finite and measurable number.  Attempts at a workable theory of quantum gravity include string theory, loop quantum gravity, non-commutative geometry, causal dynamical triangulation, and a holographic universe.  Of course, which hypothesis you prefer is not a decision to be taken lightly:

The challenge of uniting QFT and GR is further compounded by the lack of experimental results that could point to a breakdown of either QFT or GR, or of results from experiments that are sensitive to both theories.  Scientists are turning to a variety of astrophysical as well as tabletop experiments to address this issue.

Searching for Common Ground Between Quantum Field Theory and General Relativity

Testing the predictions of quantum theory on macroscopic scales is one of the outstanding challenges for modern physics.  Some experiments are not tests of a specific theory of quantum gravity, per se.  Rather, they look for a deviation from some fundamental tenet of either QFT or GR, with the hope that this will guide theorists in how to supplant either QFT or GR.  Other experiments attempt to create or observe conditions that are sensitive to both theories, to see how they play together.

Common to many philosophical or phenomenological approaches to QG is the possibility that fundamental symmetries, essential in our current understanding of the universe, may not hold at extremely small distance scales or high energy scales, due to a discrete structure of spacetime. Or, perhaps these symmetries do not hold in a highly curved spacetime with boundaries, such as in the vicinity of a microscopic black hole or the cosmological horizon of an inflationary universe.

These symmetries include Lorentz Invariance (LI) and CPT symmetry (charge conjugation – parity transformation – time reversal).  Lorentz invariance means that a property or process remains invariant under a Lorentz transformation.  That is to say, it is independent of the coordinate system and independent of the location or motion of the observer, and the location or motion of the system.  CPT symmetry requires that all physical phenomena are invariant under the combined operations of charge conjugation (swapping matter and antimatter), parity transformation (reflection in a mirror), and time reversal (viewing the process in reverse).

The IceCube South Pole Neutrino Observatory has weighed in on this issue, setting extremely tight limits on a possible violation of Lorentz Invariance.  Neutrinos, lacking strong or electromagnetic interactions and moving at essentially the speed of light (due to their teeny, tiny, and as-yet unmeasurable, mass), are sensitive probes of these effects.  IceCube uses data from interactions of high energy atmospheric and astrophysical neutrinos in the South Pole ice.  See Search for a Lorentz-violating sidereal signal with atmospheric neutrinos in IceCube, Stringent constraint on neutrino Lorentz-invariance violation from the two IceCube PeV neutrinos, and Probing Planck scale physics with IceCube.

The Fermi Gamma-ray Space Telescope is also a member of this club, using photons rather than neutrinos: Constraints on Lorentz Invariance Violation with Fermi-LAT Observations of Gamma-Ray Bursts and Constraints on Lorentz Invariance Violation from Fermi Large Area Telescope Observations of Gamma-Ray Bursts.

Another sweet spot is the equivalence principle (EP), which provides the foundational basis for GR.  The EP is the idea that the effects of acceleration are indistinguishable from the effects of a uniform gravitational field.  The EP requires that gravitational and inertial mass are equivalent; that a particle’s coupling to a gravitational field is equal to its inertial mass.  See, for example, Expanded solar-system limits on violations of the equivalence principle or A millisecond pulsar in a stellar triple system.

Foundational Principles of Quantum Mechanics and the Cosmic Microwave Background

I have previously discussed the resurgence of de Broglie-Bohm mechanics, despite its historical neglect, in Hydrodynamic Quantum Analogs.

In Beyond the Quantum, Antony Valentini follows the logical consequences of Louis de Broglie’s pilot wave theory to predict evidence of quantum non-equilibrium in the Cosmic Microwave Background (CMB).   Pilot-wave theory makes use of hidden variables.  The canonical interpretation of quantum mechanics says that there are no well-defined trajectories.  But in pilot-wave theory, these hidden variables describe the trajectories for whatever particles or fields a system may contain.  They can also explain the apparently random outcomes of quantum measurements.

Pilot-wave theory gives the same observable results as conventional quantum theory if the hidden variables have a particular distribution, a quantum equilibrium distribution, analogous to an ensemble of particles being in a thermal equilibrium.  But, as Valentini points out, there is nothing in de Broglie’s dynamics that requires this assumption to be made.  When the hidden variables have an equilibrium distribution, superluminal signaling is not possible; any attempted non-local signals would average out to zero.  However, if the hidden variables are not in an equilibrium distribution, superluminal signals may become controllable and observable! Relativity theory would be violated; time would be absolute rather than relative to each observer!
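The statistical point can be illustrated with a toy simulation (all the numbers below are illustrative assumptions, not drawn from Valentini’s papers): when the hidden variables follow the equilibrium distribution, measured frequencies reproduce the Born rule; a hypothetical non-equilibrium distribution would produce frequencies that standard quantum theory forbids.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-state system: |psi> = a|0> + b|1>; the Born rule gives P(0) = |a|^2.
a, b = np.sqrt(0.7), np.sqrt(0.3)
born_p0 = abs(a) ** 2  # 0.7

n = 100_000

# Quantum equilibrium: hidden variables distributed per the Born rule,
# so measurement statistics match conventional quantum theory.
eq_outcomes = rng.random(n) < born_p0
print(eq_outcomes.mean())      # ~0.7, consistent with the Born rule

# Hypothetical non-equilibrium distribution (illustrative only): hidden
# variables over-populate |0>, so measured frequencies would deviate.
noneq_p0 = 0.85
noneq_outcomes = rng.random(n) < noneq_p0
print(noneq_outcomes.mean())   # ~0.85, violating the Born-rule prediction
```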

To help understand this, Valentini provides an analogy with classical physics:

“…For a box of gas, there is no reason to think that the molecules must be distributed uniformly within the box with a thermal spread in their speeds. That would amount to restricting classical physics to thermal equilibrium, when in fact classical physics is a much wider theory. Similarly, in pilot-wave theory, the ‘quantum equilibrium’ distribution – with particle positions distributed according to the Born rule – is only a special case. In principle, the theory allows other ‘quantum non-equilibrium’ distributions, for which the statistical predictions of quantum theory are violated – just as, for a classical box of gas out of thermal equilibrium, predictions for pressure fluctuations will differ from the thermal case. Quantum equilibrium has the same status in pilot-wave dynamics as thermal equilibrium has in classical dynamics. Equilibrium is a mere contingency, not a law.

…It seems natural to assume that the universe began in a non-equilibrium state, with relaxation to quantum equilibrium taking place during the violence of the Big Bang.

…The crucial question is whether the early non-equilibrium state could have left traces or remnants that are observable today.”

Quantum non-equilibrium at the onset of inflation would modify the spectrum of anisotropies (differences from place-to-place) in the CMB sky.  Hence, measurements of the CMB can test for the presence of quantum non-equilibrium during the inflationary phase.

See also: Samuel Colin and Antony Valentini, Mechanism for the suppression of quantum noise at large scales on expanding space, where the authors present numerical simulations showing how the expansion of space can slow down the relaxation to quantum equilibrium in the super-Hubble regime:

“Given these results it is natural to expect a suppression of quantum noise at super-Hubble wavelengths. Such suppression could have taken place in a pre-inflationary era, resulting in a large-scale power deficit in the cosmic microwave background”.

A variety of tests of fundamental physics, conceivable with artificial satellites in Earth orbit and elsewhere in the solar system, are discussed in David Rideout, et al., Fundamental quantum optics experiments conceivable with satellites — reaching relativistic distances and velocities:

“We propose to push direct tests of quantum theory to larger and larger length scales, approaching that of the radius of curvature of spacetime, where we begin to probe the interaction between gravity and quantum phenomena. …the potential to determine the applicability of quantum theory at larger length scales, eliminate various alternative physical theories, and place bounds on phenomenological models motivated by ideas about spacetime microstructure from quantum gravity.”

Table-Top Tests of Quantum Mechanics and General Relativity

The question of simultaneously observing the effects of quantum physics and GR in a table-top experiment can be framed as simply as this: The idea that particles can be in superpositions of multiple states (states with different trajectories, different spins, different energies, etc.) is an essential feature of quantum mechanics.  If a particle is in a superposition of states with different paths through a gravitational field, for example, the different components of the superposition should be affected differently by the different trajectories through spacetime.  If a particle is in a superposition of different energy states, these different components should create different gravitational fields.  If a macroscopic object could be placed in a superposition of oscillating and non-oscillating, for example, its gravitational field should also split into a superposition.  What does a superposition of gravitational fields look like and how does it behave?

Unfortunately, quantum superpositions are very delicate. As soon as a particle in a superposition interacts with the environment, it appears to collapse into a definite state (see Decoherence and the Quantum to Classical Transition; or Why We Don’t See Cats that are Both Dead and Alive).  Only tiny particle-sized entities can be maintained in quantum superpositions for any significant period of time.  However, only macroscopic objects have detectable gravitational fields.  So this presents immense technical challenges for experimentalists.  People are working very hard to improve upon these limitations.  See, for example, Brian Pepper, et al, Optomechanical superpositions via nested interferometry and Macroscopic superpositions via nested interferometry: finite temperature and decoherence considerations.

Magdalena Zych and her colleagues are searching for evidence of gravitationally-induced time dilation and its effects on the phase of a quantum state: Quantum interferometric visibility as a witness of general relativistic proper time (also available here).  They propose using a Mach-Zehnder interferometer (MZI) in a gravitational field.  According to GR, proper time flows at different rates in different regions of spacetime.  Their proposed experiment requires a particle with evolving internal degrees of freedom, such as spin or internal vibrations, that can act as a clock.  And the two different legs of the MZI are at different gravitational potentials.


Difference between probabilities to find the particle in different outputs of the Mach–Zehnder interferometer as a function of the time ΔT for which the particle travels in a superposition of two trajectories (corresponds to changing the length of the interferometric arms). Without the ‘clock’ degrees of freedom, the dashed, black line would be the result. With the ‘clock’ and the predictions of GR, the predicted result is the blue line. From “Quantum interferometric visibility as a witness of general relativistic proper time”.

If there is a difference in proper time elapsed along the two legs of the interferometer, the particle’s internal clock will evolve into two different quantum states.  This is a consequence of the prediction that the clock ticks at different rates when placed in different gravitational potentials.  As a result of the quantum complementarity between interference and which-path information (in the form of the different internal clock values), the general relativistic time dilation will cause a decrease in the interferometric visibility (see the adjacent figure).
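This visibility loss can be sketched with a toy model (assumed parameters, not the authors’ full calculation): treat the internal clock as a two-level system with frequency omega, so the overlap of the two clock states after proper times differing by Δτ sets the fringe visibility.

```python
import numpy as np

# Toy model (assumed parameters): a two-level internal "clock" with
# frequency omega rides along each interferometer arm.  The arms sit at
# different gravitational potentials, so proper times differ by dtau.
omega = 1.0                              # clock frequency, arbitrary units
dtau = np.linspace(0, 4 * np.pi, 200)    # proper-time difference between arms

# Which-path information carried by the clock reduces fringe visibility
# by the overlap of the two clock states: V = |cos(omega * dtau / 2)|.
visibility = np.abs(np.cos(omega * dtau / 2))

# Detection probability in one output port at interferometer phase phi:
phi = 0.0
p_plus = 0.5 * (1 + visibility * np.cos(phi))

print(p_plus[0])   # dtau = 0: full visibility, P = 1.0
# When omega*dtau = pi the clock states are orthogonal: visibility -> 0
# and P -> 0.5, i.e. full which-path information, no interference.
```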

“Such a reduction in the visibility is a direct consequence of the general relativistic time dilation, which follows from the Einstein equivalence principle. Seeing the Einstein equivalence principle as a corner stone of general relativity, observation of the predicted loss of the interference contrast would be the first confirmation of a genuine general relativistic effect in quantum mechanics.”


This has been just a sampling of the work underway to pry nature’s secrets from her grasp.  For theorists and experimentalists alike, working on the interplay between QFT and GR, with the ultimate goal of creating a theory of QG, is one of the most challenging and stimulating areas of research in fundamental physics.  If this brief discussion has piqued your interest, let me know.  I can point you towards more resources concerning the theoretical and experimental work taking place on the road to quantum gravity.

Quantum Cheshire Cats: Blurring the Distinction Between Science and Science Fiction

What Would Happen if a Quantum Cheshire Cat Were to Visit the Leisure Hive?

Happy Holidays, Everyone!  Today’s article, just in time for your New Year’s Eve party, is on something extremely cool.  It has to do with a paradox that is completely unintuitive and that is only revealed by weak measurements.  A particle and its properties can be in different locations!

In the classic Doctor Who episode The Leisure Hive, a so-called “science of tachyonics” serves as the basis for entertaining guests at a resort.  A person enters a booth and their head and limbs are seemingly separated from their body, yet remain animated and are then harmlessly reattached.

That is, of course, full-fledged science fiction.  However, a quantum particle such as a photon, an electron, or an atom, apparently can have its properties located in a position separate from the particle itself.

Recent theoretical and experimental work has invigorated the search for “quantum Cheshire cats”.  Before I continue, however, I want to stress that the reference to cats is strictly metaphorical.  Just as with the case of Schrödinger’s cat (Decoherence and the Quantum to Classical Transition; or Why We Don’t See Cats that are Both Dead and Alive), decoherence prevents macroscopic objects from displaying these quantum mechanical properties.

In Search of a Quantum Cheshire Cat

The authors of Quantum Cheshire Cats (also available here) define a “quantum Cheshire cat” as a photon that is in one location while its circular polarization is in another.  The metaphor comes from the Cheshire cat in the story of Alice in Wonderland, whose smile persists independent of the cat:

The “cat” is the photon and its “smile” is the photon’s circular polarization state.  The photon is in one of two possible locations, the left or right side of a modified Mach-Zehnder interferometer.  Using weak measurements, including cleverly chosen pre-selected and post-selected states, leads to a sample of events where the photon went through the left arm with certainty.  However, a polarization detector in the right arm can still see a signal!

“We seem to see what Alice saw—a grin without a cat! We know with certainty that the photon went through the left arm, yet we find angular momentum in the right arm.”

The paradox is removed if conventional, strong measurements of position and polarization are performed.  The inevitable and apparent wave function collapse occurs and the photon’s position and angular momentum are found to be co-located.  This is analogous to directly measuring which slit the particle goes through in a double slit experiment, which prevents an interference pattern from forming.  Strong measurements are analogous to turning the light on and letting the cockroaches quickly scurry into hiding.  Everything looks normal.  But, weak measurements are like peering at what is going on in the dark, without scaring the roaches away.

Using weak measurements (The Strength of Weak Measurements in Quantum Physics), the disturbance on the state of the system can be reduced by accepting less precision.  Then, the measurement is repeated many, many times to achieve the desired accuracy.  This reveals that the circular polarization was in fact in the right leg of the interferometer while the photon was in the left, for certain pre- and post-selected events.
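The trade-off at the heart of weak measurement, noisy individual readings compensated by massive repetition, can be sketched numerically (the numbers below are assumed, illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)

true_value = 0.25      # the quantity we want to estimate
noise_sigma = 10.0     # each weak measurement is wildly imprecise...

# ...but barely disturbs the system, so it can be repeated many times
# on identically prepared systems.
n_trials = 1_000_000
readings = true_value + noise_sigma * rng.normal(size=n_trials)

estimate = readings.mean()
print(estimate)        # close to 0.25: precision recovered by averaging
# The statistical error shrinks as sigma / sqrt(N) = 10 / 1000 = 0.01 here.
```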

What Do We Do With a Quantum Cheshire Cat Once We Catch One?

Conventional wisdom is that when you look at, or measure, a quantum system, the wave function collapses into something that makes sense from a classical level.  That is to say, strange or apparently contradictory paradoxes disappear.  However, that assumes strong measurements.  Until weak measurements were explored theoretically and experimentally in recent years, the distinction between strong and weak measurements was not appreciated.

Contemplating the implications of quantum Cheshire cats opens up several mind-boggling possibilities and opportunities.  Separating physical properties, such as mass, energy, charge, magnetic moment, etc., from what we conventionally understand to be a particle could lead to new and more precise measurements, new technologies, new materials…  Additionally, it has profound implications for our conceptual understanding of quantum physics and what a quantum system is up to between measurements or between interactions.  Scientists will be exploring this amazing field for many years to come.

Additional information:

In Quantum Cheshire Cats, the authors discuss a couple of modifications (beyond the reach of existing technology, but likely possible eventually) where the signature of a quantum Cheshire cat should be unambiguous; using ensembles of electrons, for example.

Proposed modifications to the setup discussed above, i.e. using entangled pre- and post-selected states to allow the linear as well as the circular polarization states to be separated from the photon, are discussed in The Complete Quantum Cheshire Cat.

Possible hints of the metaphorical quantum Cheshire cat have been seen: Observation of a quantum Cheshire Cat in a matter wave interferometer experiment  “…using a neutron interferometer… The experimental results suggest that the system behaves as if the neutrons went through one beam path, while their spin travelled along the other.”

The quantum Cheshire cat is an example of an interaction free measurement.  Another example is the Elitzur–Vaidman bomb tester, also known as a quantum mechanical bomb tester.  Also see Using quantum mechanics to detect bombs.

Masters student Catherine Holloway lectures on the science behind a quantum bomb detector at the Quantum Cryptography School for Young Students, held at the Institute for Quantum Computing, University of Waterloo:

Decoherence and the Quantum to Classical Transition; or Why We Don’t See Cats that are Both Dead and Alive

Conflating Science with Pseudoscience

The spreading of misinformation and misconceptions about the quantum world can be lumped into two different categories.  The first category is people who mean well, who want to advance science and scientific understanding.  Maybe they write a book, give public lectures, or create news articles about recent events in quantum science, for example.  However, they use misleading analogies, miss essential features, fail to properly address alternatives to a failing orthodoxy, or mischaracterize apparently paradoxical phenomena.  As a result, they end up misleading or confusing the general public or their students.  Another failure mode within this category is the use of excessive hype.  Due to their own passions or the desire to spread the excitement of physics, they mislead about the implications of quantum physics in general.  They over-promise when describing the latest incremental step in theoretical or experimental physics; or they mislead about the nature of reality.

The second category is just plain fraudulent; people who deliberately make things up to deceive others for profit.  Prominent examples of this include books and talks like the ones by Deepak Chopra, and movies like What the Bleep Do We Know!?  Rest assured, there is no such thing as quantum healing.  You cannot change your quantum state through your thoughts.  Real harm is done by these quacks when, for example, someone forgoes proven medical treatments for pseudoscience.

My contention is that because we do not do enough to mitigate the negative impact of the first category, the fraudulent category is able to spread easily and quickly amidst fertile grounds.  The public is susceptible to charlatans peddling pseudoscience and quackery by throwing in sciency sounding phrases, and references to quantum physics that no one (including themselves) understands.  Moreover, their claims have no relationship to reality.

There will always be a certain number of people eager to believe whatever pseudoscience or pseudo-religion these hucksters want to sell.  But, if we want to influence the fraction of the public that is interested in separating fact from fantasy, we need to be clearer and more precise in our own presentations of physics.  Moreover, if we want to retain our credibility with the general public as we seek to dispel the drivel these hucksters distribute, we need to make sure we are precise about what QM is and what it is not, what we understand about it and what we do not.

Misconceptions about the Quantum to Classical Transition


Experimental setup for the Schrödinger’s cat thought experiment. Image from Wikipedia.

One example that contributes to the confusion is the parable of Schrödinger’s cat.  A cat, a flask of poison, and a radioactive source are placed in a sealed box (this is a hypothetical thought experiment, of course – no cats were harmed…).  If an internal monitor detects a single atom decaying, the flask is shattered, releasing the poison that kills the cat.  Naïve application of the Copenhagen interpretation of quantum mechanics leads to the conclusion that the cat is simultaneously dead and alive.  Up until it is measured by a conscious observer, the atom is in a superposition of having decayed and not decayed.  And this superposition allegedly extends to the radiation detector, the vial of poison, the hammer to break the vial, the cat, the box, and to you as you wait to open the box.

People trot out Schrödinger’s cat whenever they want to tout how strange QM is.  “See how weird and paradoxical QM is, how bizarre and unintuitive its predictions, how strange the universe is?  Anything is possible with quantum mechanics, even if you don’t understand it or I can’t explain it.”  No, quantum mechanics is not an “anything goes” theory.  A cat cannot be simultaneously dead and alive, regardless of whether or not we observe it.

References to the role of the observer or of consciousness in determining outcomes contribute to this mess.  Even in interpretations of QM that refer to a special role for an observer or a consciousness (interpretations that I believe miss the target of reality), the observer cannot control or manipulate outcomes by choice or thought.  He/she is merely triggering an outcome to become reality; the particular outcome that nature chooses is still random.  You cannot decide to pick out a different wave function for yourself.  Additionally, interpretations of QM that do not have any need for a special role for a conscious observer (and are thus, in my opinion, better approximations of reality) are readily available.  See, for example, the Transactional Interpretation.

Isolating the Environment in Classical Physics

In “Decoherence, einselection, and the quantum origins of the classical,” Wojciech Zurek had this to say:

“The idea that the “openness” of quantum systems might have anything to do with the transition from quantum to classical was ignored for a very long time, probably because in classical physics problems of fundamental importance were always settled in isolated systems.”

For centuries, progress in our understanding of how the world works has been made by isolating the system under study from its environment.  In many experiments, the environment is a disturbance that perturbs the system under investigation and contaminates the results of the experiment.  The environment can cause unwanted vibrations, friction, heating, cooling, electrical transients, false detections, etc.  An isolated system is an idealization where other sources of disturbance have been eliminated as much as possible in order to discover the true underlying nature of the system or physical properties under investigation.

Galileo Galilei is considered by many to be the founding father of the scientific method.  By isolating, reducing, or accounting for the secondary effects of the environment (in actual experiments and in thought experiments) he discovered several principles of motion and matter.  These principles, such as the fact that material objects fall at the same rate regardless of mass and what they are made of, had been missed or misunderstood by Galileo’s predecessors.  A famous example is the experiment where Galileo dropped two metal balls of different size, and hence different mass, from the top of a building (supposedly the leaning tower of Pisa).  Luckily, the effects of air resistance were negligible for both balls, and they hit the ground at roughly the same time.  He would not have been able to do the experiment with a feather and a steel ball, for example, because air resistance has a much more dramatic effect on the light feather than on the steel ball.  It is an interesting bit of physics why that is the case, but I’ll avoid the temptation to take that detour for now.

During an Apollo 15 moon walk, Commander David Scott performed Galileo’s famous experiment in a live demonstration for the television cameras (see the embedded video below).  He used a hammer (1.32 kg) and a feather (0.03 kg; appropriately an eagle feather).  He held both out in front of himself and dropped them at the same time.  Since there is no atmosphere on the moon (effectively, a vacuum), there was no air resistance; both objects underwent the same acceleration and struck the lunar surface simultaneously.
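The lunar result is easy to check with the constant-acceleration formula t = sqrt(2h/g), which contains no mass at all (the drop height below is an assumed value, roughly where Scott released the objects):

```python
import math

# Time to fall from rest through height h: t = sqrt(2h/g).
# Mass does not appear, so in vacuum the hammer (1.32 kg) and the
# feather (0.03 kg) land together.
h = 1.5          # assumed drop height in metres
g_moon = 1.62    # lunar surface gravity, m/s^2
g_earth = 9.81   # for comparison

t_moon = math.sqrt(2 * h / g_moon)
t_earth = math.sqrt(2 * h / g_earth)
print(round(t_moon, 2), round(t_earth, 2))  # 1.36 s on the Moon, 0.55 s on Earth
```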

Superposition and Interference: the Nature of Quantum Physics

The situation is quite different in quantum mechanics.  First of all, the correlations between two systems can be of fundamental importance and can lead to properties and behaviors that are not present in classical systems.  The distinctly non-classical phenomena of superposition, interference, and quantum entanglement, are just such features.  Additionally, it is impossible to completely isolate a quantum system from its environment.

According to quantum mechanics, any linear combination of possible states also corresponds to a possible state.  This is known as the superposition principle.  Probability distributions are not the sum of the squares of the individual wave function amplitudes.  Rather, they are the square of the sum of the wave function amplitudes.  What this means is that there is interference between possible outcomes.  There is a possibility for outcome A and B, in addition to A or B, even though our preconceived notions, based on our classical experiences of everyday life, tell us that A and B should be mutually exclusive outcomes.  Superposition and the interference between possible states lead to observable consequences, such as in the double-slit experiment, K-meson oscillations, neutrino oscillations, quantum computers, and SQUIDs.
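The difference between adding probabilities and adding amplitudes is easy to make concrete (the amplitudes below are arbitrary illustrative values):

```python
import numpy as np

# Amplitudes for two possible paths, A and B (assumed values):
psi_a = 0.6 * np.exp(1j * 0.0)
psi_b = 0.8 * np.exp(1j * np.pi / 3)

# Classical intuition: add the probabilities of each path.
p_classical = abs(psi_a) ** 2 + abs(psi_b) ** 2   # 0.36 + 0.64 = 1.0

# Quantum rule: add the amplitudes first, then take the squared magnitude.
p_quantum = abs(psi_a + psi_b) ** 2

# The discrepancy is exactly the interference term 2*Re(psi_a * conj(psi_b)),
# which depends on the relative phase of the two amplitudes.
interference = 2 * (psi_a * np.conj(psi_b)).real
print(round(p_quantum - p_classical, 6), round(interference, 6))  # both 0.48
```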

We do not see superpositions of macroscopic, everyday objects or events.  We do not see dead and alive cats.  Sometimes, our common sense intuitions can mislead us.  But this is not one of those times.  The quantum world is more fundamental than the classical world.  The classical world emerges from the quantum world.  So what happens that makes these quantum behaviors disappear?  Why does the world appear classical to us, in spite of its underlying quantum nature?

Coherence, and Then Naturally, Decoherence

Two waves are said to be coherent if they have a constant relative phase.  This leads to a stable pattern of interference between the waves.  The interference can be constructive (the waves build upon each other producing a wave with a greater amplitude) or destructive (the waves subtract from each other producing a wave with a smaller amplitude, or even vanishing amplitude).  Whether the interference is constructive or destructive depends on the relative phase of the two waves.  One of the game-changing realizations during the early days of quantum mechanics is that a single particle can interfere with itself.  Interference with another particle leads to entanglement, and the fascinating phenomenon of non-locality.

Decoherence is the Key to the Classical World

The key to the quantum-to-classical transition is decoherence.  Maximilian Schlosshauer, in “Decoherence, the measurement problem, and the interpretations of quantum mechanics,” states that

“Proponents of decoherence called it an ‘historical accident’ that the implications for quantum mechanics and for the associated foundational problems were overlooked for so long.”

Decoherence provides a dynamical explanation for this transition without an ad hoc addition to the mathematics or processes of quantum mechanics.  It is an inevitable consequence of the immersion of a quantum system in its environment.  Coherence, or the ordering of the phase angles between particles or systems in a quantum superposition, is disrupted by the environment.  Different wave functions in the quantum superposition can no longer interfere with each other.  Superposition and entanglement do not disappear, however.  They essentially leak into the environment and become impossible to detect.
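This leakage can be sketched with a two-state density matrix, under the common idealization that the environment suppresses the off-diagonal (coherence) terms exponentially; the decoherence rate and elapsed time below are hypothetical numbers for illustration only:

```python
import math

# Density matrix of the equal superposition (|0> + |1>)/sqrt(2), as a
# 2x2 list (all entries happen to be real in this example).
rho = [[0.5, 0.5],
       [0.5, 0.5]]

def decohere(rho, rate, t):
    """Suppress the off-diagonal (coherence) terms by exp(-rate * t).

    Simple exponential decay is an idealization; the true suppression
    factor depends on the details of the system-environment coupling.
    """
    damp = math.exp(-rate * t)
    return [[rho[0][0], rho[0][1] * damp],
            [rho[1][0] * damp, rho[1][1]]]

# Hypothetical rate and time. The point stands regardless of the exact
# numbers: the coherences (interference terms) are driven to zero while
# the diagonal outcome probabilities are untouched.
later = decohere(rho, rate=1e6, t=1e-4)
print(later[0][1])  # ~0: interference between the branches is gone
print(later[0][0])  # 0.5: the classical outcome probability survives
```

Note that the superposition has not been destroyed, only delocalized: in a full treatment the missing coherence reappears in the combined system-plus-environment state, where it is practically impossible to recover.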

I typically love the many educational and entertaining short videos by Minute Physics. However, the video below about Schrödinger’s cat is misleading.  Well before the cat could enter into a superposition, coherence in the chain of the events leading up to his death (or not) has been lost to the environment.  The existence of a multiverse is not a logical consequence of the Schrödinger’s cat experiment.

Perhaps the muddled correspondence principle of the Copenhagen Interpretation could have been avoided, as well as myths and misconceptions about the role of consciousness and observers, if decoherence had been accounted for from the beginning.

The Measurement Problem

Decoherence occurs because the large number of particles in a macroscopic system are continually interacting with a large number of microscopic systems (collisions with air molecules, photons from the CMB, a light source, or thermal photons, etc.).  Even a small coupling to the environment is sufficient to cause extremely rapid decoherence.  Only quantum states that are robust in spite of decoherence have predictable consequences.  These are the classical outcomes.  The environment, in effect, measures the state of the object and destroys quantum coherence.

So does decoherence solve the measurement problem?  Not really, at least not completely. It can tell us why some things appear classical when observed.  But, it does not explain what exactly a measurement is and how quantum probabilities are chosen.  Decoherence by itself cannot be used to derive the Born rule.   Additionally, it does not explain the uniqueness of the result of a given measurement.  Decoherence never selects a unique outcome.

The Universe and You


The International Space Station (ISS). Image from Wikipedia.

With care, mechanical, acoustic, and even electromagnetic isolation is possible.  But isolating a system gravitationally, i.e. from gravitons, is another challenge.  In orbit around the Earth, aboard the Space Shuttle or the International Space Station, you are still in a gravitational field, with a flux of gravitons not much different from that here on the surface of the Earth.  The apparent weightlessness is due to being in a continuous state of free fall (an example of microgravity).  Various theories have been developed that use the pervasiveness of gravitons to explain certain aspects of our quantum universe.

So, yes, the atoms and subatomic particles in your body are entangled with the universe.  That does not mean that you can do anything about it, or use it to your advantage in any way.  There is no superposition, no coherent relationship between you (1) as a millionaire dating a super model and (2) not a millionaire and not dating a super model.  Sorry about that.


The Transactional Interpretation of Quantum Mechanics

What is so strange about the Transactional Interpretation?

Thousands of physicists are willing to give serious consideration to the notions that the universe contains eleven dimensions, and that there may be something like 10^500 universes in the multiverse.  They have been willing to dedicate their careers over the past three or four decades to the study of string theory, despite the lack of experimental support.  So, why aren't more physicists willing to take the idea of advanced wave functions more seriously?  What's wrong with a little backwards time-travel?  After all, that's what led Dirac, mathematically, to predict antimatter.  We need more physicists exploring the implications of models like the Transactional Interpretation of Quantum Mechanics (TIQM), and trying to develop ideas for testing such alternative explanations for the bizarre and unintuitive behavior of matter on the quantum level.

What is the Transactional Interpretation of Quantum Mechanics?

The wave function is the quintessential component of mathematical descriptions of the quantum world.  It describes the state of a quantum system, and the Schrödinger equation describes how the wave function evolves in space and time.  The Schrödinger equation is not relativistically invariant.  However, relativistically invariant equations have been developed: the Klein-Gordon equation and the Dirac equation, for example.  The solution to these equations that moves forward in time is known as the retarded wave.  Consistent with our common-sense notions of time, this is the one that is assumed to be physically relevant.  But the complex conjugate of a retarded wave is also a solution.  This wave travels backwards in time and is called an advanced wave.  Normally, this advanced wave solution is ignored.
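A minimal illustration uses a free plane wave, with A the amplitude, k the wave number, and ω the angular frequency:

```latex
% A free plane wave (retarded solution, moving forward in time):
\psi_{\mathrm{ret}}(x,t) = A\, e^{\,i(kx - \omega t)}
% The Klein-Gordon operator has real coefficients, so the complex
% conjugate is also a solution (the advanced wave):
\psi_{\mathrm{adv}}(x,t) = \psi_{\mathrm{ret}}^{*}(x,t) = A^{*} e^{-i(kx - \omega t)}
% Substituting t -> -t in the advanced wave recovers the retarded form,
% which is why it is read as a wave travelling backward in time.
```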

Physicist John Cramer proposed TIQM back in 1986. The TIQM makes use of both the retarded and the advanced waves.  Using the mathematical formalism of TIQM, you can calculate and predict the outcomes of the same experimental and natural situations as conventional QM.  And, you arrive at identical quantitative results.  The bonus with the TIQM is that you also get a comprehensible explanation for what is physically going on.  And, you avoid the assumptions, add-ons, and paradoxes inherent to the canonical interpretation.  TIQM provides an explanation for how nature produces bizarre results in some experiments, results that are consistent with the mathematics of QM but that seem to defy our conceptions of space and time.

Application of the Transactional Interpretation

In one of my earlier posts, Quantum Weirdness: The unbridled ability of quantum physics to shock us, I discussed interaction-free and delayed choice experiments.  TIQM provides a conceptual and physical description of what the universe is up to in these experiments; how it pulls off these seemingly bizarre results.  Quantum interactions are described in terms of a standing wave formed by retarded and advanced waves.  Events require a “handshake” between the emitter and the absorber, a handshake through space and time. This is an explicitly nonlocal model for quantum events.  Nonlocality means that in quantum mechanical systems “relationships or correlations not possible through simple memory are somehow being enforced faster-than-light across space and time.”

Advantages of TIQM over mainstream alternatives include (from Cramer’s A Transactional Analysis of Interaction Free Measurements):

  • it is actually already present in the mathematical formalism of quantum mechanics
  • it is economical, involving fewer independent assumptions
  • it is paradox-free, resolving all of the paradoxes and counter-intuitive aspects of standard quantum theory, including nonlocality and wave function collapse
  • it does not give a privileged role to observers or measurements
  • it permits the visualization of quantum events

In TIQM, a source emits the usual (retarded) wave forward in time.  It also emits an advanced wave backward in time.  A receiver emits an advanced wave backward in time and a retarded wave forward in time.  A transaction is accomplished in three stages: (1) An offer wave (the usual retarded wave function) originates from the source and spreads through space-time until it encounters the absorber.  (2) The absorber responds by producing an advanced confirmation wave (the complex conjugate wave function), which travels in the reverse time direction back to the source. (3) The source chooses between all possible transactions based on the strengths of the echoes it receives.  Then, the potential quantum event becomes reality.  A probability can be calculated for each viable outcome using the wave function amplitudes, in the same manner as the Born rule in conventional interpretations.  The phases of the offer and confirmation waves are such that the retarded wave emitted by the receiver cancels the retarded wave emitted by the sender.  The advanced wave emitted by the receiver cancels the advanced wave emitted by the sender.  Hence, there is no net wave after the absorption point or before the emitting point.
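The three stages can be caricatured in a toy model of the echo-strength idea; this is an illustration, not Cramer's actual formalism, and the absorber names D1-D3 and their amplitudes are made up:

```python
import random

# Toy model of the three-stage "handshake": the offer wave reaches each
# absorber with some amplitude psi; the confirmation echo has strength
# psi * conj(psi) = |psi|^2; the source completes one transaction with
# probability proportional to that echo.
offer_amplitudes = {
    "D1": complex(0.5, 0.5),   # hypothetical absorbers and amplitudes
    "D2": complex(0.0, 0.5),
    "D3": complex(0.5, 0.0),
}

# Echo strength |psi|^2 for each absorber, then normalize.
echoes = {name: a.real ** 2 + a.imag ** 2 for name, a in offer_amplitudes.items()}
total = sum(echoes.values())
probabilities = {name: e / total for name, e in echoes.items()}
print(probabilities)  # {'D1': 0.5, 'D2': 0.25, 'D3': 0.25}

# The completed transaction is selected with Born-rule statistics.
chosen = random.choices(list(probabilities), weights=list(probabilities.values()))[0]
print(chosen)  # one of 'D1', 'D2', 'D3'
```

The point of the sketch is that weighting transactions by echo strength reproduces the same |ψ|² statistics as the Born rule in conventional interpretations.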

The Transactional Interpretation explains interaction free measurements and delayed choice experiments

Consider a Mach-Zehnder interferometer, such as the device discussed in my earlier post (Quantum Weirdness: The unbridled ability of quantum physics to shock us).  A Mach-Zehnder interferometer is used to measure the relative phase shift differences between two collimated photon beams.  The beams are created by splitting light from a single source.  Two figures of a Mach-Zehnder interferometer (one with both paths open, one with a blocked path), and the summary that follows, are from Cramer's A Transactional Analysis of Interaction Free Measurements.  Please see that reference for a more detailed description and quantitative discussion.  Although Cramer's paper specifically addresses an experiment with interaction-free measurements, similar arguments and calculations apply to delayed choice and quantum eraser experiments.

In a Mach-Zehnder interferometer, light from source L goes to a 50%-50% beam splitter S1 that divides incoming light into two possible paths. These beams are deflected by mirrors A and B, so that they meet at a second beam splitter S2 which recombines them by another reflection or transmission. The combined beams then go to the photon detectors D1 and D2.  Light source L emits only one photon within a given time period.  If paths A and B have identical lengths, the superimposed waves from the two paths are in phase at D1 and out of phase at D2. This is because with beam splitters, a reflected wave is always 90 degrees out of phase with the corresponding transmitted wave. The result is that all photons from light source L will go to (be observed at) D1 and none will be observed at D2.  Walk through the figures and make sure you understand why this is so before proceeding.

Next, use an opaque object to block the lower path (path A). This ensures that all of the light arriving at beam splitter S2 has traveled by path B. In this case there is no interference, and the 50%-50% beam splitter S2 sends equal components of the incident wave into both detectors.  Hence, there is an equal probability to observe the photon at either detector.  This is a subtle and important point.  With both paths open, the waves arriving at D2 interfere destructively while the waves arriving at D1 interfere constructively; hence no photons are observed at D2.  When path A is blocked, there is no additional wave to interfere with the wave from path B, which is split and sent to both detectors.

Quantitative Application of the Transactional Interpretation

To put some numbers behind this, we can actually calculate the amplitudes of the individual waves.  In the end, we find that TIQM is numerically equivalent to the predictions of conventional QM methodologies.  The difference is in how you interpret (or explain) what the universe is doing and why you calculate it in a particular way.  In TIQM, you account for the effects of splitting, reflecting, transmitting, combining, interfering, etc., on the amplitude and phase of each offer wave and confirmation wave. You then arrive at the following quantitative predictions.  If there is no blockage on path A, we will detect the photon at D1 100% of the time.  If we perform the same measurement with path A blocked, we will detect a photon at D1 25% of the time, a photon at D2 25% of the time, and no photon at all 50% of the time (because it is absorbed by the object in path A).  “…the detection of a photon at D2 guarantees that an opaque object is blocking path A, although no photon has actually interacted with object”.
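These numbers can be checked with a short amplitude-bookkeeping sketch, using the standard convention that a beam-splitter reflection carries a 90-degree phase relative to transmission (the factor i below), with a common mirror phase on both paths:

```python
# Amplitude bookkeeping for the Mach-Zehnder setup described above.
# Convention: beam-splitter transmission multiplies the amplitude by t,
# reflection by r = i*t (90 degrees out of phase); each mirror adds a
# common phase factor m = i.
t = 1 / 2 ** 0.5
r = 1j * t
m = 1j

def probabilities(path_a_open):
    """Return (P at D1, P at D2, P absorbed by the blocking object)."""
    # Path A: reflected at S1, then bounced off mirror A.
    # Path B: transmitted at S1, then bounced off mirror B.
    a = r * m if path_a_open else 0.0
    b = t * m
    amp_d1 = a * t + b * r   # each path: one splitter reflection + one transmission
    amp_d2 = a * r + b * t   # path A reflected 3 times, path B reflected once
    p_absorbed = 0.0 if path_a_open else abs(r) ** 2
    return abs(amp_d1) ** 2, abs(amp_d2) ** 2, p_absorbed

p_open = probabilities(True)
p_blocked = probabilities(False)
print([round(p, 12) for p in p_open])     # [1.0, 0.0, 0.0]
print([round(p, 12) for p in p_blocked])  # [0.25, 0.25, 0.5]
```

With both paths open the amplitudes at D2 cancel exactly; blocking path A removes that cancellation and splits the surviving path-B amplitude equally between the detectors, reproducing the 25/25/50 figures above.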

Consider again the situation in which no object is present in path A. The offer waves from L to detector D1 arrive at D1 with the same amplitudes and in phase with each other.  They interfere constructively, reinforce, and produce a confirmation wave that is initially of amplitude 1.  This confirmation wave then returns to the source by all available paths. Each path brings the confirmation wave to the source L in phase because, as with the offer waves, the confirmation waves on both paths have been transmitted once and reflected twice.  Similarly, the offer waves from L to detector D2 arrive at D2 180 degrees out of phase, because the offer wave on path A has been reflected three times while the offer wave on path B has been transmitted twice and reflected once. Therefore, the two offer waves interfere destructively and cancel at D2, and no confirmation wave is produced as a result.  Since the source L receives a unit amplitude confirmation wave from detector D1 and no confirmation wave from detector D2, the transaction forms from L to D1 via paths A and B. The result of the transaction is that a photon is always transferred from the source L to detector D1 and that no photons are transferred to D2.

When there is an object blocking path A, it is probed both by the offer wave from L and by the aborted confirmation waves from D1 and D2.  When a photon is detected at D1, the object has not interacted with a photon. However, it has been probed by offer and confirmation waves from both sides, modifying the interference relationship at the detectors, and hence the ultimate probabilities. The offer wave along path A never reaches one of the detectors (but it does reach the object). The offer wave on path B reaches both detectors.  The source receives confirmation waves from both detectors (returning along path B) and also from the object.  This leads to the probabilities mentioned above.

Transactional Interpretation: Conclusions

In the TIQM, interactions are explicitly nonlocal because “the future is, in a limited way, affecting the past (at the level of enforcing correlations)”.  One of the consequences of the TIQM is that it forces us to alter our understanding of essentially all interactions or observations:

“When we stand in the dark and look at a star a hundred light years away, not only have the retarded light waves from the star been traveling for a hundred years to reach our eyes, but the advanced waves generated by absorption processes within our eyes have reached a hundred years into the past, completing the transaction that permitted the star to shine in our direction.”

TIQM offers to resolve many of the paradoxes inherent in QM, such as the mystery of wave function collapse and the awkward role of the observer.  However, it does not specifically answer the question of what the wave function is, of what is waving.  It also requires you to accept the physicality of advanced waves travelling backwards in time.  To me, there seems to be something important in these ideas, something pointing towards a more fundamental theory of the quantum world.  I appreciate the fact that it seeks to offer an explanation.  Additionally, it does not seem to carry as much intellectual or conceptual baggage as some other alternative interpretations (more on this in future posts).

For more information, see John Cramer’s TIQM webpage: The Transactional Interpretation of Quantum Mechanics,  or An Overview of the Transactional Interpretation.   Selected publications by John Cramer can be found here: Research in Theoretical Physics.

Fun with Quantum Computing at University of Bristol

Physics is Fun, at University of Bristol

Run your own quantum computing experiments

Have fun with quantum physics and quantum computing!  Gain practical experience using the resources offered by the University of Bristol: Qcloud.  Test out your quantum experiments in their online quantum processor simulator, which includes reference material and a user's guide.  Then, you can (starting 20 September) register and run your experiment in their lab: “create and manipulate your own qubits and measure the quantum phenomena of superposition and entanglement.”