Sunday 22 March 2015

The uncertainty principle and wave-particle duality are equivalent

In the 17th century, physicists debated the true nature of light. After observing that a glass prism splits light into different colors, Isaac Newton hypothesized that light is composed of particles he called corpuscles, which undergo refraction when they accelerate into a denser medium.

At around the same time, Christiaan Huygens proposed that light is made up of waves. In his treatise published in 1690, he described how light propagates by means of spherical waves, and explained how reflection and refraction occur.

In 1704, Newton published Opticks, expounding on his corpuscular theory of light. The debate over whether light was a particle or a wave raged for almost a century, and was not settled until Thomas Young's double-slit interference experiments, which could only be explained if light was a wave.

The story did not end there. In 1901, Max Planck was able to explain the energy curve of blackbody radiation by supposing that light was emitted in small packets of energy. Planck thought of these packets, or quanta, as a convenient mathematical device and did not believe them to be real. However, in 1905 Albert Einstein showed that the photoelectric effect could be explained in terms of localized quanta of light we now call photons. In 1927, Louis de Broglie constructed a pilot wave theory that attempted to explain how the particle and wave aspects of light can coexist.
We now know that this ability to exhibit wave and particle behavior is a property of any quantum system, so it is also true of things like electrons or atoms. But there is a trade-off between observing particle aspects and wave aspects. For example, in the double-slit experiment, trying to determine which slit a photon goes through results in interference fringes that are less visible. Mathematically, this is expressed as the Englert-Greenberger-Yasin duality relation

$D^2 + V^2 \le 1$,

which relates the visibility $V$ of interference fringes to the distinguishability $D$ of the photon's path in a double slit (or any other interference) experiment.
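To get a feel for this trade-off, here is a minimal numerical sketch (my own, not taken from any paper discussed here) using the standard qubit description of the two paths. As a simple stand-in for $D$ it uses the path predictability $|p_1 - p_2|$, the special case of distinguishability with no which-way detector, and it takes $V$ to be twice the magnitude of the off-diagonal coherence; a pure state saturates $D^2 + V^2 = 1$, and added noise makes the inequality strict.

import numpy as np

# Path qubit in the basis {|1>, |2>} = {top slit, bottom slit}.
def duality_check(theta, phi, noise):
    psi = np.array([np.cos(theta), np.sin(theta) * np.exp(1j * phi)])
    rho = (1 - noise) * np.outer(psi, psi.conj()) + noise * np.eye(2) / 2
    D = abs(rho[0, 0] - rho[1, 1])   # path predictability, a stand-in for D
    V = 2 * abs(rho[0, 1])           # fringe visibility = twice the coherence
    return D, V

for noise in (0.0, 0.3, 0.7):
    D, V = duality_check(theta=np.pi / 5, phi=0.4, noise=noise)
    print(f"noise={noise:.1f}: D^2 + V^2 = {D**2 + V**2:.3f} <= 1")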

In 1927, Niels Bohr formulated the principle of complementarity, which asserts that certain pairs of properties of quantum systems are complementary, in the sense that they cannot both be measured accurately in a single experiment.

Bohr conceived of the idea of complementarity from discussions with Werner Heisenberg regarding his then newly-discovered uncertainty principle. The uncertainty principle expresses a fundamental limit to how precisely certain pairs of properties, such as position and momentum, can be jointly measured. The formal inequality relating the standard deviations of position and momentum was first derived by Earle Hesse Kennard and Hermann Weyl. The more general version commonly used today is due to Howard P. Robertson:

$\mathrm{stdev}(X) \ \mathrm{stdev}(Z) \ge  \dfrac{| \langle [X,Z] \rangle |}{2}$,

where $X$ and $Z$ are complementary operators, $[X,Z] = XZ- ZX$ is called the commutator, $\mathrm{stdev}(A)$ denotes the standard deviation of $A$, and $\langle A \rangle$ denotes the expectation value of $A$. When $X$ denotes position and $Z$ denotes momentum, the right-hand side is equal to $1/2$ (in units where $\hbar = 1$).
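As a quick concrete check (again a sketch of my own), one can verify the Robertson inequality numerically for the Pauli operators $\sigma_x$ and $\sigma_z$ acting on a random qubit state:

import numpy as np

rng = np.random.default_rng(1)

X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli sigma_x
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli sigma_z

# A random pure qubit state.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

def expect(A):
    return (psi.conj() @ A @ psi).real

def stdev(A):
    return np.sqrt(expect(A @ A) - expect(A) ** 2)

comm = X @ Z - Z @ X                        # the commutator [X, Z]
lhs = stdev(X) * stdev(Z)
rhs = abs(psi.conj() @ comm @ psi) / 2      # |<[X, Z]>| / 2

print(f"{lhs:.4f} >= {rhs:.4f}")            # Robertson's inequality holds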

From Bohr's perspective, wave-particle duality is an expression of quantum complementarity, qualitatively no different from how the position and momentum of photons cannot be simultaneously determined in a double-slit apparatus. Yet Berge Englert showed in 1996 that the duality relation can be derived without using any uncertainty relations, which seems to imply that they are logically independent concepts.

It was only recently, in 2014, that Patrick Coles, Jędrzej Kaniewski, and Stephanie Wehner showed that wave-particle duality and the uncertainty principle are formally equivalent if one uses modern entropic versions of uncertainty relations. This interesting result, which unifies two fundamental concepts in quantum mechanics, is explored here.



Entropy as a measure of uncertainty


As we have seen above, Heisenberg uncertainty relations are typically expressed in terms of standard deviations. However, information theory has taught us that uncertainty is best measured in terms of entropy. Before we state the entropic version of the uncertainty principle, it may be helpful to take a short digression into how entropy is defined.

Consider the experiment of rolling a die. It has 6 possible outcomes, which we denote by $x$. If the probability of getting $x$ is $p(x)$, the amount of information in it can be measured by

$I(x) = \log[1/ p(x)]$

where $\log$ denotes the base-2 logarithm, i.e., $\log(2^y) = y$. The quantity $I(x)$ is called the surprisal of $x$, since observing an improbable outcome ($p(x)$ is small) is quite surprising (its corresponding $I(x)$ is big).
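For example, each outcome of a fair coin toss has $p(x) = 1/2$ and thus carries $I(x) = \log 2 = 1$ bit of surprisal, while rolling a 1 with a fair die carries $I(1) = \log 6 \approx 2.58$ bits.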

Entropy (denoted by $H$) is formally defined as the expectation value of the surprisal:

$H(X) = \sum_x p(x) \log [1 /p(x)]$,

where $X$ is the random variable that takes values $x$. It may be helpful to think of $X$ as representing the outcome of some type of measurement.

For example, if $X$ describes the outcome of a dice roll, the entropy of $X$ is given by

$H(X) = p(1) \log (1 / p(1) ) + ... + p(6) \log (1 / p(6) ).$   

If we believe the die to be fair, then all possible outcomes are equally probable, i.e., $p(1) = p(2) = ... = p(6) = 1/6$. The entropy in this case becomes

$H(X) = 6 (1/6) \log [1/(1/6)] = \log(6).$

If the probabilities are not all equal, the entropy will be smaller than $\log(6)$. Suppose we have a loaded die that always turns up 1; then $p(1) = 1$, while the rest have probability 0. In that case, $H(X) = 1 \cdot \log(1) = 0$, i.e., the entropy vanishes when the outcome of a roll is certain.

This is why we say entropy measures uncertainty: its value is largest when we think all possibilities are equally likely to happen, i.e., when we are completely unsure of the outcome, and zero when we know what the outcome will be.
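As a quick sanity check, here is a short Python sketch (using numpy, with the usual convention that terms with $p(x) = 0$ contribute nothing) that computes the entropy of a fair die and of the loaded die above:

import numpy as np

def entropy(p):
    # Shannon entropy in bits; terms with p(x) = 0 contribute nothing.
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(np.sum(nz * np.log2(1 / nz)))

fair = [1/6] * 6                 # all outcomes equally likely
loaded = [1, 0, 0, 0, 0, 0]      # always rolls 1

print(entropy(fair))    # log2(6) = 2.585 bits, the maximum for 6 outcomes
print(entropy(loaded))  # 0 bits: the outcome is certain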



Entropic uncertainty relations


Entropic versions of the Heisenberg uncertainty relation were first considered by Isidore Hirschman in the context of Fourier analysis in the 1950s, with the connection to quantum mechanics mostly established in the 1970s. The most well-known version today was proven by Hans Maassen and Jos Uffink in 1988:

$H(X) + H(Z) \ge - 2 \log c$,

where $X$ and $Z$ refer to measurements with outcomes $x$ and $z$, respectively, and

$c = \max_{x,z} | \langle x | z \rangle |$,

where $|x \rangle $ is the quantum state corresponding to outcome $x$ of measurement $X$, $|z \rangle $ is the quantum state corresponding to outcome $z $ of measurement $Z$, and $\langle x | z\rangle $ is the inner product between the two vectors. 

For instance, if $X = Z$, then $c = 1$ and $\log c = 0$, and we get $H(X) \ge 0$. Of course, all this tells us is that it is possible to be certain of the outcome of measurement $X$.

Conceptually, what the Maassen-Uffink relation means is that the combined uncertainty of the outcomes of any pair of measurements is always larger than some quantity that depends only on the particular pair of outcomes, one from each measurement, that are hardest to distinguish from each other ($c$ is largest when $|x \rangle = |z \rangle$).
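Here is a minimal numerical check of the relation (my own sketch) for a qubit measured either in the $\{|0\rangle, |1\rangle\}$ basis or in the $\{|+\rangle, |-\rangle\}$ basis, a complementary pair for which $c = 1/\sqrt{2}$ and the bound is exactly 1 bit:

import numpy as np

rng = np.random.default_rng(7)

# A random pure qubit state.
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

# Two complementary measurements: X in the {|0>, |1>} basis,
# Z in the {|+>, |->} basis (columns are the outcome states).
basisX = np.eye(2, dtype=complex)
basisZ = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def entropy(p):
    # Shannon entropy in bits.
    nz = p[p > 1e-15]
    return float(np.sum(nz * np.log2(1 / nz)))

pX = np.abs(basisX.conj().T @ psi) ** 2    # outcome probabilities p(x)
pZ = np.abs(basisZ.conj().T @ psi) ** 2    # outcome probabilities p(z)

# c = the largest overlap between an outcome state of X and one of Z.
c = max(abs(np.vdot(x, z)) for x in basisX.T for z in basisZ.T)

print(f"H(X) + H(Z) = {entropy(pX) + entropy(pZ):.4f} >= {-2 * np.log2(c):.4f}")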

Actually, what Maassen and Uffink derived was a more general relation involving a more general definition of entropy called the Rényi entropy:

$H_a(X) = \dfrac{1}{1-a} \log \left[ \sum_x p(x)^a \right].$

The more general Maassen-Uffink relation reads

$H_a (X) + H_b (Z) \ge -2 \log c,$

where we must have $1/a + 1/b = 2$. The special case for the usual (Shannon) entropy is obtained when $a = b = 1$.
          
The entropic uncertainty relation that is equivalent to the duality relation is 

$H_\mathrm{min} (X) + H_\mathrm{max} (Z)   \ge 1$,

which can be obtained from the general Maassen-Uffink relation when $a = \infty$ and $b = 1/2$; the bound here equals 1 because the relevant qubit measurements are complementary, with $c = 1/\sqrt{2}$ and thus $-2 \log c = 1$.

$H_\infty$ is called the min-entropy, or $H_\mathrm{min}$, because it is always the smallest among all Rényi entropies. $H_{1/2}$ is called $H_\mathrm{max}$ to contrast it with $H_\mathrm{min}$, and also because it is the largest Rényi entropy allowed by the relation (Rényi entropies never increase as the order $a$ grows).
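To make these quantities concrete, here is a small sketch that computes Rényi entropies of a probability distribution, using the closed forms $H_\mathrm{min}(X) = -\log \max_x p(x)$ and $H_\mathrm{max}(X) = 2 \log \sum_x \sqrt{p(x)}$ obtained by setting $a = \infty$ and $a = 1/2$ in the definition:

import numpy as np

def renyi_entropy(p, a):
    # Renyi entropy of order a, in bits.
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if a == 1:                         # Shannon entropy (the limit a -> 1)
        return float(np.sum(p * np.log2(1 / p)))
    if np.isinf(a):                    # min-entropy: -log max p(x)
        return float(-np.log2(p.max()))
    return float(np.log2(np.sum(p ** a)) / (1 - a))

p = [0.7, 0.2, 0.1]
print(renyi_entropy(p, np.inf))   # H_min ~ 0.515
print(renyi_entropy(p, 1))        # Shannon H ~ 1.157
print(renyi_entropy(p, 0.5))      # H_max ~ 1.356 (never below the others)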

Although $H_\mathrm{max}$ and $H_\mathrm{min}$ sound rather technical, they have relatively simple operational meanings in cryptography. $H_\mathrm{min} (X)$ is related to the probability of guessing $X$ correctly using an optimal strategy, via $H_\mathrm{min}(X) = -\log p_\mathrm{guess}(X)$: the harder it is to guess the value of $X$, the larger its min-entropy gets. $H_\mathrm{max} (Z)$ describes the amount of secret information contained in $Z$.

Thus, in cryptographic terms, the entropic uncertainty relation describes a certainty-confidentiality trade-off between $X$ and $Z$: if $X$ is easy to guess then $Z$ is more concealed, but if $X$ is hard to guess then $Z$ is more exposed. Intuitively, this tells us we can only determine $X$ at the expense of $Z$, and vice versa.

Now we are ready to see that the same intuition holds if $X$ refers to some particle behavior of a quantum system and $Z$ refers to its wave behavior.


The complementary guessing game



Figure: Complementary guessing game. For each photon detected on the screen in the double-slit experiment, Alice's goal is either to determine which slit the photon passed through or which position was used for the source. Because the questions involve the complementary wave and particle behavior of light, trying to observe one behavior limits our ability to observe the other.


Consider the double-slit experiment depicted above. We have a light source on the left that emits photons onto two narrow slits, and interference fringes can be seen on a screen placed far beyond the slits. The idea here is to describe an individual photon as a qubit with two possible states, depending on whether it is "made" to behave as a particle or as a wave.

Let $P$ describe the path a photon traverses, where $|1 \rangle$ refers to the top slit and $|2 \rangle$ refers to the bottom slit. $P$ exhibits the particle behavior of light (since the two possible states tell us whether light passes through the top or bottom slit).

To get two different wave behaviors, the source can be moved along the up-down direction, between two positions that we have labelled $|t\rangle $ and $|b \rangle$. If we write them in terms of the path states we would get

$|t\rangle = [ |1 \rangle + e^{i g} |2\rangle ] / \sqrt{2}$,
$|b\rangle = [ |1 \rangle - e^{i g} |2 \rangle ] / \sqrt{2}$,

where $g$ is some angle called the phase. Let $W$ take the values $|t \rangle$ and $|b \rangle$. $W$ exhibits the wave behavior of light (since its two possible states refer to light that passes partially through both slits).

(The basic idea is that particles have well-defined locations while waves have well-defined phases.)
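The following sketch (with an arbitrary choice of phase $g$) builds the wave states out of the path states and confirms that the two descriptions are complementary: $|t\rangle$ and $|b\rangle$ are orthogonal, while every overlap between a path state and a wave state has magnitude $1/\sqrt{2}$:

import numpy as np

g = 0.8                                   # an arbitrary phase, for illustration
path1 = np.array([1, 0], dtype=complex)   # |1>: top slit
path2 = np.array([0, 1], dtype=complex)   # |2>: bottom slit

top = (path1 + np.exp(1j * g) * path2) / np.sqrt(2)   # |t>
bot = (path1 - np.exp(1j * g) * path2) / np.sqrt(2)   # |b>

print(abs(np.vdot(top, bot)))       # 0: the wave states are orthogonal
for p in (path1, path2):
    for w in (top, bot):
        print(abs(np.vdot(p, w)))   # each 0.7071 = 1/sqrt(2)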

Consider a guessing game where Alice chooses whether to run a particle experiment or a wave experiment by deciding where the source is placed. Her task is to determine the state of each photon: the value of $P$ if the source is kept in the middle, and the value of $W$ if the source is moved up or down.

The following entropic uncertainty relation restricts Alice's ability to guess:

$H_\mathrm{min} (P) + \min\{ H_\mathrm{max} (W) \} \ge 1.$

Here, it is necessary to take the minimum of $H_\mathrm{max}$ because when Alice runs a wave experiment to guess $W$, there are many possible choices for $g$. However, she can always determine $g$ by measuring the visibility of the interference pattern. So, mathematically, finding $g$ corresponds to finding the minimum value of $H_\mathrm{max}$.

Now we can go back to the wave-particle duality relations. There are actually many versions of the duality relation, which vary according to the specific details of the particle or wave experiments used in the guessing game.

In the simplest version, the distinguishability $D$ refers to Alice's ability to predict the path in a particle experiment, while the visibility $V$ is related to the probability the photon is detected at a particular spot on the screen.

Coles, Kaniewski, and Wehner showed that in terms of the variables $P$ and $W$ in our guessing game above, we would have

$H_\mathrm{min} (P) = - \log [ (1 + D) / 2]$,
$\min \{ H_\mathrm{max} (W) \} = \log [ 1 + \sqrt{1 - V^2 } ].$

(The above formulas are derived using many tools of quantum information theory, so unfortunately their derivation goes well beyond what we can explain here concisely.)

From the formulas we observe that when $D$ is big, $H_\mathrm{min}$ is small, which is what we expect since a high $D$ means the photon path is easy to guess. Meanwhile, when $V$ is big, $H_\mathrm{max}$ is small, which again makes sense since highly visible interference fringes make the phase easy to determine; conversely, when the fringes have low visibility, the value of the phase is well-hidden.

From the entropic uncertainty relation, we obtain

$- \log [ (1 + D)/2] + \log [1 + \sqrt{1 - V^2}] \ge 1.$

Raising 2 to the power of both sides (since $2^{\log(x)} = x$) gives

$\left(\dfrac{2}{1+D}\right)  [ 1 + \sqrt{1 - V^2}] \ge 2.$

Working out the algebra (and using the fact that $D \ge 0$, so both sides may be squared), we find that

$1 + \sqrt{1 - V^2} \ge 1 + D \quad \implies \quad \sqrt{1 - V^2} \ge D \quad \implies \quad 1 - V^2 \ge D^2 \quad \implies \quad 1 \ge D^2 + V^2$,

which is indeed (one of) the wave-particle duality relation(s). 
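As a final sanity check, one can sweep over all pairs $(D, V)$ allowed by the duality relation and confirm numerically, using the two entropy formulas above, that the entropic bound is satisfied throughout (with equality on the boundary $D^2 + V^2 = 1$):

import numpy as np

ok = True
for D in np.linspace(0, 1, 101):
    # The duality relation allows V up to sqrt(1 - D^2) for each D.
    for V in np.linspace(0, np.sqrt(1 - D**2), 101):
        h_min = -np.log2((1 + D) / 2)
        h_max = np.log2(1 + np.sqrt(1 - V**2))
        ok = ok and h_min + h_max >= 1 - 1e-9   # tolerance for rounding
print("bound holds for all admissible D, V:", ok)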

This illustrates that duality relations are, in fact, entropic uncertainty relations in disguise. I suspect Niels Bohr would have been quite happy with such a result.


Reference:

P. Coles, J. Kaniewski, and S. Wehner, "Equivalence of wave-particle duality to entropic uncertainty," Nature Communications 5, 5814 (2014).







