### balls and urn

Consider an urn holding N balls, each either white or black. Whenever I take out a ball, it is replaced randomly with either a white or a black ball, with fixed probability p for the replacement ball being white.

I have been playing this game for quite a while (so that the start configuration no longer matters), and now I take out n white balls in a row (n < N).

What is the probability that the next ball is also white?

If p is (near) 1/2 then one could make these two arguments:

1) If we take out a sequence of n white balls, this indicates that there are probably more white balls in the urn than black balls (due to a random fluctuation in the replacement process), so the next ball is most likely also white: P(white) > P(black).

2) If we take out n white balls, the ratio of white to black balls necessarily decreases, so it is more likely that the next ball is actually black: P(black) > P(white).

What do you think? And does it make a difference if we actually know p and N ?

added later: I have posted the solution now as a comment, but I have to warn you that this is fun only if you really try to find an answer yourself first.
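If you want to check your own guess empirically before reading the posted solution, a short Monte Carlo sketch of the game is easy to write. The function name and all parameter values below (N, p, n, the number of trials, the warm-up length) are my own illustrative choices, not part of the puzzle:

```python
import random

def urn_estimate(N=10, p=0.5, n=3, trials=20_000, warmup=100, seed=1):
    """Estimate P(next draw is white | the last n draws were all white).

    All default parameter values are illustrative choices.
    """
    rng = random.Random(seed)
    white_next = conditioned = 0
    for _ in range(trials):
        urn = [False] * N              # start all black ...
        for _ in range(warmup):        # ... and play long enough to forget it
            urn[rng.randrange(N)] = rng.random() < p
        streak = True
        for _ in range(n):             # try to draw n white balls in a row
            i = rng.randrange(N)
            if not urn[i]:
                streak = False         # a black ball ends this attempt
                break
            urn[i] = rng.random() < p  # replace the drawn white ball
        if streak:                     # condition on the streak happening
            conditioned += 1
            white_next += urn[rng.randrange(N)]
    return white_next / conditioned

print(f"{urn_estimate():.3f}")
```

Comparing the estimate for different values of p and n against the two arguments above is a good sanity check before looking at the solution in the comments.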

### decoherence

One can still quite often read or hear the argument that *decoherence* solves the measurement problem and therefore further discussion of interpretations is unnecessary.

Fortunately, I can keep this blog post short with a link to this paper:

Stephen Adler, Why Decoherence has not Solved the Measurement Problem.

-----

If one wants to read about the role of decoherence *within* different popular interpretations, I recommend this paper:

Maximilian Schlosshauer, Decoherence, the measurement problem, and interpretations of quantum mechanics

It notes that "Decoherence adherents have typically been inclined towards relative-state interpretations ... It may also seem natural to identify the decohering components of the wave function with different Everett branches." and then proceeds to discuss two important open issues of that interpretation: the preferred-basis problem and the problem of probabilities in Everett interpretations.

If one wants to go down that route, I recommend this paper for further reading.

### the Coleman argument

In 1994 Sidney Coleman gave the lecture 'Quantum Mechanics in Your Face', which was recorded; the video is available here. Near the end he makes an argument, actually attributed to Dave Albert, which is nowadays often used in debates about the meaning of quantum theory (usually people just link to the video without discussing the argument much further).

But, as we shall see, it does not really work the way people think it does.

Consider a quantum system (e.g. electron in a Stern-Gerlach apparatus) which evolves from an initial state |i> into a superposition |a> + |b> (*). An observer (named Sidney) makes a measurement and evolves from an initial state |I> into a superposition |A> + |B>. How can we reconcile this superposition of observer states with our everyday normal conscious experience?

Well, consider all the states |Sj> of 'Sidney with a normal conscious experience', where j supposedly labels all the different conscious experiences Sidney can have. All those states |Sj> are eigenstates of the 'consciousness operator' C so that

C|Sj> = |Sj>.

It is clear that |A>, which here means 'Sidney observed |a>', is an eigenstate of C and also |B> is an eigenstate of C,

C|A> = |A> and C|B> = |B>.

But it follows immediately that |A> + |B> is then also an eigenstate of C,

C( |A> + |B> ) = ( |A> + |B> ), from the linearity of the quantum operator C.

This would mean that the superposition of Sidney states does not really cause a problem; the superposition still corresponds to a 'normal conscious experience'.

So it seems that there is no 'measurement problem' as long as we stay fully within quantum theory. Therefore this argument is very popular with people who want to *end* the interpretation debate, while I think it may be a good starting point to *begin* a serious discussion.

In order to see that something is wrong with the Coleman argument, consider that there must be states |Uj> of Sidney in which he does not have a 'normal conscious experience', e.g. because he is asleep, drunk or worse.

Obviously |Uj> cannot be an eigenstate of C with eigenvalue 1; instead we have C|Uj> = 0.

The problem is that we have to assume that the initial state |I> of Sidney does not just evolve into the superposition |A> + |B>; it will also contain some components |Uj>, because even for a healthy person there is a small but non-zero probability of falling into an unconscious state. But as one can check easily, the state

|A> + |B> + |Uj> is certainly

*not* an eigenstate of C, and due to the superposition of states Sidney will in general never be in such an eigenstate of C. This would mean that Sidney never has a 'normal conscious experience'.
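The eigenvalue bookkeeping above can be made concrete in a small toy model. The three-dimensional space, the choice of C as a projector, and the small weight eps are my own illustrative assumptions, not part of Coleman's lecture:

```python
import numpy as np

# Toy 3-dimensional Hilbert space with basis (|A>, |B>, |U>).
# The 'consciousness operator' C projects onto the subspace spanned
# by |A> and |B>, so C|A> = |A>, C|B> = |B>, C|U> = 0.
A = np.array([1.0, 0.0, 0.0])
B = np.array([0.0, 1.0, 0.0])
U = np.array([0.0, 0.0, 1.0])
C = np.outer(A, A) + np.outer(B, B)

# Sidney after the measurement: a superposition of two eigenvalue-1
# eigenstates is again an eigenvalue-1 eigenstate, by linearity.
superpos = (A + B) / np.sqrt(2)
print(np.allclose(C @ superpos, superpos))

# Add a tiny unconscious component eps*|U>: the state is no longer
# an eigenstate of C.
eps = 0.01
state = A + B + eps * U
state /= np.linalg.norm(state)
print(np.allclose(C @ state, state))
```

The first check prints True and the second False, which is exactly the tension between the linearity argument and the |Uj> components described above.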

Obviously, there must be a problem somewhere with this whole argument and I leave it as an exercise for the reader to find it and post it as a comment 8-)

(*) I leave out the necessary normalization factors of 1/sqrt(2) etc. in this text and the reader has to assume that these factors are absorbed in the state vectors. E.g. in the state |A> + |B> + |U> we assume that |U> comes with a very small weight/probability, but I do not make this small pre-factor explicit.
