created a currently fairly empty entry quantum measurement, just so as to have a place to give a commented pointer to the article
It does not resolve it yet, though the idea might be interesting for the latter. Measurement has nothing to do with a limit from quantum to classical. First of all, measurement is a TIME-DEPENDENT process at finite, constant but small Planck constant, possibly smaller than the difference between two neighboring states. Also, the measurement does not need to happen around the ground state. In a time-dependent process, once macroscopic details resolve the difference between neighboring energy levels, one of the quasieigenstates becomes slightly more of an eigenstate at the moment of measurement (which is still long in comparison to a single cycle), so it carries away most of the amplitude, while there are cancellations in the other state. It has been known for decades that this can be achieved with an external stochastic perturbation. Of course, no perfect “reduction” is possible. But it is not needed either: life goes on, we are never in perfect eigenstates anyway, and the residual wavefunction gets fed into future processes and so on. The missing part, however, is that nobody has reproduced the Born rule from a general measurement time-evolution model of this kind.
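(Just to fix notation for what “reproducing the Born rule” would mean for such a time-dependent model; this is the standard statement, added here for orientation and not part of the original comment: if the apparatus distinguishes the quasieigenstates $|i\rangle$ and the system starts in $\psi = \sum_i c_i |i\rangle$, the dynamics should drive almost all of the amplitude into a single $|i\rangle$, with relative frequency over many runs
$$
  P(i) \;=\; |c_i|^2 \;=\; |\langle i | \psi\rangle|^2 ,
$$
up to the small residual amplitude left in the other branches.)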
Measurement has nothing to do with a limit from quantum to classical.
That’s maybe your postulate. His is the opposite, and he proceeds from it. I can see why his postulate makes sense, though I can also see why others might make sense.
Part of the problem is that there is no consensus on what the formal problem to be called “the measurement problem” is, I suppose. But it seems unlikely to me that you can formulate anything sensible without some recourse to classical or thermodynamic limits, even to make sense of a notion of “measurement” in the first place.
Measurement is happening in real time, isn’t it? You need a model of reality, measurement included, which is in agreement with reality. (E.g. the Planck constant is constant and does not change.) So you need to see how, in real time, a macroscopic distribution of approximate eigenstates in a real process chooses a single such “eigenstate” after a portion of time. A quantum theory of measurement, once created, should produce the Born rule out of consideration of such an imperfect macroscopic system (macroscopic should be defined carefully, but it certainly includes that the system is so big that the eigenlevels are so close that a small perturbation can mix them). To define a measurement you just prepare such a macroscopic system in which the choice between the different allowed states is not stable in the degenerate assembly. No experiment can distinguish between the “reduction” and a process in which 99.999% of the wavefunction is in the state which, in that microtuned period, is closer to a true eigenstate than the others. Once it is there, you can push the consequences of “observing” by channeling the chain of events further, and so on. Of course, Copenhagen people do not accept that, as they believe in absolute eigenstates with infinite decay time and so on, while everything in a big system is metastable and has some frequency after which the next cycle may have a phase of -180 degrees.

I have believed for 25 years or so that it is possible to make a time-dependent simulation getting there. I will be delighted when somebody does it carefully enough to get the Born rule out of a time-dependent general model of this type. I cannot appreciate the syncretism of invoking modes which do not exist in nature, like the Planck constant going to true zero, and pretending to have a model of measurement without time included in the picture (and without seeing WHEN the quasireduction (if you wish, emergent reduction) happens, and whether it happens gradually or discretely; I believe the former).
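As a hedged numerical aside (not the closed macroscopic model being asked for above): the closest standard toy model I know of in which a stochastic perturbation drives the amplitude into one quasieigenstate with Born-rule statistics is the diffusive continuous-measurement (“quantum state diffusion”) equation. A minimal sketch for a two-level system, with all parameters chosen purely for illustration:

```python
# Toy illustration (not the macroscopic time-dependent model discussed above):
# a two-level system under the standard diffusive continuous-measurement equation
# for the observable A = sigma_z,
#   d|psi> = [ -i H dt - (gamma/2)(A - <A>)^2 dt + sqrt(gamma)(A - <A>) dW ] |psi>,
# integrated with Euler-Maruyama and renormalization.  Individual trajectories
# localize onto one of the two quasieigenstates, and the fraction of runs ending
# near |0> approaches |c0|^2, i.e. Born-rule statistics emerge from the noise.
import numpy as np

rng = np.random.default_rng(0)

# initial superposition c0|0> + c1|1>  (illustrative choice)
c0, c1 = np.sqrt(0.7), np.sqrt(0.3)

n_traj, n_steps, dt, gamma = 2000, 4000, 1e-3, 1.0
E0, E1 = 0.0, 0.05                  # two nearly degenerate levels
H = np.diag([E0, E1])
A = np.diag([1.0, -1.0])            # measured observable (sigma_z)

psi = np.tile(np.array([c0, c1], dtype=complex), (n_traj, 1))

for _ in range(n_steps):
    expA = np.real(np.einsum('ti,ij,tj->t', psi.conj(), A, psi))   # <A> per trajectory
    dW = rng.normal(0.0, np.sqrt(dt), size=n_traj)                  # Wiener increments
    Hpsi = psi @ H.T
    dev = psi @ A.T - expA[:, None] * psi                # (A - <A>)|psi>
    dev2 = dev @ A.T - expA[:, None] * dev               # (A - <A>)^2 |psi>
    psi = psi + (-1j * Hpsi - 0.5 * gamma * dev2) * dt + np.sqrt(gamma) * dev * dW[:, None]
    psi /= np.linalg.norm(psi, axis=1, keepdims=True)    # renormalize each step

p0_final = np.abs(psi[:, 0])**2
frac_up = np.mean(p0_final > 0.5)
print(f"fraction of trajectories collapsed near |0>: {frac_up:.3f}  (Born rule: {abs(c0)**2:.3f})")
```

Each trajectory ends up almost entirely in one of the two states, and the fraction of trajectories ending near $|0\rangle$ reproduces $|c_0|^2$. What this toy model does not supply is a derivation of the noise term itself from the macroscopic device, which is exactly the missing part discussed above.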
That’s maybe your postulate.
No, I am not putting in any postulate, especially not the classical nor thermodynamical limits. The Schroedinger equation and a macroscopic universe (I gave some hints about what kind of features of the prepared quantum system I would require in order to call it macroscopic with respect to a certain time-dependent quantum mechanical process of approaching a choice between the approximate eigenstates), nothing other than this minimum. Reduction is just a consequence of a macroscopic device, not in the Copenhagen sense, but modelled as part of the universe described by the Schroedinger equation, plus some openness of the system and a regime of sensitive mixing between quasieigenstates. Finally, if one has true quantum mechanics with emergent reduction rules (which is very appealing and I am an enthusiast for it), I do not think that you need any assumption other than that the universe is macroscopic (hence it is globally not in a perfectly symmetric state, from the origin of time; no global degeneracy). Isolating some macroscopic part and calling it a measurement device certainly does not produce a perfect closed system without perturbation and with absolute eigenstates. That should be enough, morally.
You see, the basic requirement is just the consistency of the QM description of the time evolution of the world. You describe it by the Schroedinger equation etc. Now one has those “reductions”. You can take two stands. Either you believe that the description by the Schroedinger equation is incomplete and that something else happens “at reduction time”. Or you say that if you include the measurement device in a bigger system, in which it is just a regular QM system, then QM should produce the same result without an assumption of reduction there, though possibly requiring some reduction when looking at the bigger system from an external point of view. The Copenhagen interpretation says that nothing more can be said than what is said from the point of view of some level of measurement with reduction, so it is irrelevant to go into more detail about the process than can be produced by modelling the mini measurement from the point of view of the bigger-system measurement, within which the smaller one is just a QM system without observable discontinuities. Hence, the very Copenhagen interpretation says that there ought to be a QM model of measurement in the mini system, and any difference between two measurement models which is not observable from the point of view of the external bigger system is in fact unphysical.

So the whole purpose is to model it, using QM with fixed Planck constant, within a bigger system with the Copenhagen interpretation applied to that bigger system, time-dependent, and with some level of openness and closeness of quasieigenstate levels. Of course, a posteriori, if we had such a theory, we could try to accommodate some interpretation other than Copenhagen, but I think the search for a measurement theory can be posed as a reasonably well defined task even for a strict believer in the Copenhagen interpretation (in which “classical” is about the semantics of observation, not about the Planck constant going to zero! though it does not matter, as I say, since you have a system within a system for the purposes of modelling, with the external reality being Copenhagen and the internal QM without ad hoc reduction). This all looks forced on us by usual QM with the Copenhagen interpretation; of course solving the problem is really hard, but the solution should exist.
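(To spell the consistency requirement out in the standard von Neumann pre-measurement notation, as an illustration only: if the mini measurement is modelled inside the bigger system by a unitary interaction
$$
  \Big(\textstyle\sum_i c_i |i\rangle\Big) \otimes |\mathrm{ready}\rangle
  \;\xrightarrow{\;U_{\mathrm{int}}\;}\;
  \sum_i c_i\, |i\rangle \otimes |\mathrm{pointer}_i\rangle ,
$$
then applying reduction only at the level of the bigger system, i.e. to the pointer, must give outcome $i$ with probability $|c_i|^2$, the same statistics as applying reduction directly to the mini system; any model of the mini measurement that changed these externally observable statistics would be physically distinguishable.)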
For me the following slogan holds: a quantum theory of measurement is a quantum mechanical model (assume standard quantum mechanics) of a quantum process with emerging reduction phenomenon. Ideally, one would like to have an emergent model for any QM reduction process, but to start with it would be nice to have it for one. I think some general features of the would-be theory could be predicted from consistency checks as I alluded to above, but this is still too little.
a quantum process with emerging reduction phenomenon.
That’s incidentally exactly what Landsman argued for: an emergent phenomenon. Pure QM without any limits can’t give it. The relation to spontaneous symmetry breaking that he made is rather neat, I think.
Pure QM without any limits can’t give it.
It cannot give reduction to true eigenstates, I agree. But the macroscopic experiment does not work with infinite-duration true eigenstates. For quasieigenstates which are very close, in a time-dependent process you might have evolutions determined by random perturbations (and similar macroscopic features) which lead to effective reduction. All the arguments against the existence of a theory of measurement are for true eigenstates and do not work for processes like the one I describe. Plus, as I say, you can have a measurement subsystem which is quantum-mechanical and which is embedded into a bigger system with Copenhagen reduction (which in that case you do not explain), but for consistency you do have the task of deriving the measurement theory for the subsystem within QM, inside the Copenhagen bigger system. For the consistency of QM with the Copenhagen interpretation this is not only possible, it is necessary. I am sorry for those experts in foundations who gave up on something which the theory logically requires.
Landsman does not describe the time-dependent process. He does not say where the Schroedinger equation with finite Planck’s constant stops and the zero (or infinitesimal) one takes over. This is the same problem the Copenhagen interpretation had: where and how the reduction happens, for which subsystem, etc. He puts the limit at some unpredicted place in time and space, just as the reduction by the Born rule is also just postulated. Which system would perform the limit? He even lets a microscopic system perform the limit. But in real time a microscopic system will not do the reduction. True, he can say it happens when the macro-measurement is happening. But calling it a limit, or calling it the Born rule taking effect, is at about the same level of a hand-waved device without a prediction of which system will exhibit it. So it is not a theory of measurement.
The relation to spontaneous symmetry breaking
It is of course the default setting for reduction to have a degeneracy with the different choices for reduction on an equal footing. Almost all attempts at models of QM reduction that I have thought about since high school were of the symmetry-breaking kind. If one wants to reduce the features of the system to the minimum, I can hardly imagine not having a model of symmetry breaking. But this symmetry breaking must happen as a real time-dependent process, starting with a wave packet and ending with most of the amplitude in a quasieigenstate. Unfortunately such a complete description is entirely missing in Landsman’s work.
What Landsman got right is the role of the off-diagonal perturbation distinguishing the true eigenstates from quasieigenstates. And he did it in more detail and in a more axiomatic framework than any similar analysis I was aware of before. But this was known from less rigorous analyses of thought experiments much earlier, though most authors whom I have read quotations from referred to the need for a “stochastic perturbation”; I think that neither incoherence nor fundamentally statistical behaviour of this perturbation may really be necessary, but unless somebody does the theory right, one may think that it is really needed for a complete theory.
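(For readers who want a concrete instance of this “off-diagonal perturbation” mechanism, here is my schematic summary of Landsman–Reuvers’ double-well example; the formulas are indicative only. For a symmetric double well the low-lying doublet is approximately
$$
  \psi_\pm \;\simeq\; \tfrac{1}{\sqrt{2}}\big(\psi_L \pm \psi_R\big),
  \qquad
  \Delta E \;\sim\; e^{-S_0/\hbar},
$$
with exponentially small splitting, so an asymmetric perturbation (the “flea”) of size $\varepsilon \gg \Delta E$, however tiny for small $\hbar$, localizes the ground state in one well. What is not supplied, as stressed above, is the real-time dynamics by which this localization happens.)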
I put in a half-assed idea section at measurement (without the quantum), which you may proceed to tear apart (or even edit).
I put one remarkable recent (series of) references into entanglement (added also the redirect quantum entanglement); the references are due to A. P. Balachandran et al. Entanglement is also about the division of the system into two parts, as in your idea of measurement. The novel feature of the work is to include identical particles in the study of quantum entanglement. The work is in an operator-algebraic framework, based on the use of the GNS construction and the von Neumann entropy.
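(For reference, the standard quantities involved, not specific to these papers: for a bipartite state $\rho$ the reduced state and its von Neumann entropy are
$$
  \rho_A \;=\; \mathrm{Tr}_B\,\rho ,
  \qquad
  S(\rho_A) \;=\; -\,\mathrm{Tr}\big(\rho_A \log \rho_A\big) ;
$$
in the operator-algebraic setting, as I understand it, one instead restricts the state to a subalgebra of observables and takes the von Neumann entropy of the density matrix arising from its GNS representation, which is how the identical-particle case is handled.)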
I have (here):
expanded on what the projection postulate of quantum measurement says (recalled in the formulas right after this list), and added reference to the original proposals by von Neumann and Lüders;
added discussion of two choices for typing quantum measurement,
with pointers to how the Quipper community seems to think about this;
added a comment that both choices emerge and are unified in the Quantum Modal Logic inherent to dependent linear type theory.
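For the record, the statement recalled there (in the Lüders form; von Neumann’s original prescription differs for degenerate eigenvalues): for an observable with spectral projections $\{P_i\}$, measurement on a state $\rho$ yields outcome $i$ with probability
$$
  p(i) \;=\; \mathrm{Tr}\big(P_i\,\rho\big),
$$
and the post-measurement state is $P_i\,\rho\,P_i / \mathrm{Tr}(P_i\,\rho)$; for a pure state $\psi$ this is $\psi \mapsto P_i\psi/\|P_i\psi\|$ with probability $\|P_i\psi\|^2$.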
I have added pointer to:
Bob Coecke, Duško Pavlović, Quantum measurements without sums, in Louis Kauffman, Samuel Lomonaco (eds.), Mathematics of Quantum Computation and Quantum Technology, Taylor & Francis (2008) 559-596 [arXiv:quant-ph/0608035, doi:10.1201/9781584889007]
Bob Coecke, Duško Pavlović, Jamie Vicary, A new description of orthogonal bases, Mathematical Structures in Computer Science 23 3 (2012) 555-567 [arXiv:0810.0812, doi:10.1017/S0960129512000047]
also pointer to:
added pointer also to
Bob Coecke, Eric Oliver Paquette, POVMs and Naimark’s theorem without sums, Electronic Notes in Theoretical Computer Science 210 (2008) 15-31 [arXiv:quant-ph/0608072, doi:10.1016/j.entcs.2008.04.015]
Bob Coecke, Eric Paquette, Dusko Pavlovic, Classical and quantum structures (2008) [pdf]
and linked with quantum reader monad, which page I will create now
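The key result in the Coecke–Pavlović–Vicary reference above, stated schematically (my paraphrase): orthonormal bases $\{|i\rangle\}$ of a finite-dimensional Hilbert space $H$ correspond precisely to special commutative $\dagger$-Frobenius algebra structures on $H$, with “copy” and “delete” maps
$$
  \delta : H \to H \otimes H,\quad \delta(|i\rangle) = |i\rangle \otimes |i\rangle,
  \qquad
  \epsilon : H \to \mathbb{C},\quad \epsilon(|i\rangle) = 1,
$$
so that measurement in that basis can be axiomatized “without sums” of projectors, as in the titles above.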
added pointer to today’s
added this pointer to the list of references on the “classical structures” Frobenius monad:
added pointer to yesterday’s
added the precise pointers (as far as I can spot) where Coecke’s “classical structures” evolved into the “spider”-component of the ZX-calculus:
Bob Coecke, Ross Duncan, §3 in: Interacting Quantum Observables, in Automata, Languages and Programming. ICALP 2008, Lecture Notes in Computer Science 5126, Springer (2008) [doi:10.1007/978-3-540-70583-3_25]
Aleks Kissinger, §§2 in: Graph Rewrite Systems for Classical Structures in †-Symmetric Monoidal Categories, MSc thesis [pdf, Kissinger-CLassicalStructures.pdf:file]
Aleks Kissinger, §4 in: Exploring a Quantum Theory with Graph Rewriting and Computer Algebra, in: Intelligent Computer Mathematics. CICM 2009, Lecture Notes in Computer Science 5625 (2009) 90-105 [doi:10.1007/978-3-642-02614-0_12]
Bob Coecke, Ross Duncan, Def. 6.4 in: Interacting Quantum Observables: Categorical Algebra and Diagrammatics, New J. Phys. 13 (2011) 043016 [arXiv:0906.4725, doi:10.1088/1367-2630/13/4/043016]
Making the same edit also at quantum reader monad
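For orientation (schematic form only; phase and normalization conventions vary between the references above): the $Z$-spider with $n$ inputs, $m$ outputs and phase $\alpha$ is the linear map
$$
  |0\rangle^{\otimes m}\langle 0|^{\otimes n} \;+\; e^{i\alpha}\, |1\rangle^{\otimes m}\langle 1|^{\otimes n},
$$
and the defining fusion rule says that connected spiders of the same colour compose into a single spider with phases added; for $\alpha = 0$, $n = 1$, $m = 2$ this is exactly the Frobenius “copy” map of the classical structures.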
and pointer to:
added pointer to:
With the new linear modal way of seeing things, is there anything to say about the quantum Zeno effect? What capacity is there to speak of duration/speed of measurement/preparation in linear HoTT?
Haven’t thought about formalizing the quantum Zeno effect yet, but there is no fundamental ingredient at play outside of what we already have.
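For the record, the textbook form of the effect that would need formalizing (standard derivation, nothing specific to the linear-HoTT formulation yet): for short times the survival probability is
$$
  p(\tau) \;=\; \big|\langle \psi| e^{-i H \tau/\hbar} |\psi\rangle\big|^2
  \;=\; 1 - \frac{(\Delta H)^2}{\hbar^2}\,\tau^2 + O(\tau^4),
$$
so $n$ projective measurements at intervals $\tau = T/n$ give
$$
  p_n(T) \;\approx\; \Big(1 - \frac{(\Delta H)^2\, T^2}{\hbar^2\, n^2}\Big)^{n} \;\longrightarrow\; 1
  \qquad (n \to \infty),
$$
i.e. sufficiently frequent measurement freezes the evolution.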
Right now I am busy with writing out something more profound, namely the encoding of Hermitian inner product structure — hence of the probabilistic aspect of quantum physics — as a self-duality condition in “Real vector spaces” in the sense of Atiyah (namely in ℤ/2-equivariant modules over ℂ, the latter regarded as a monoid in ℤ/2-equivariant real vector spaces via complex conjugation).
This is getting at some deep aspect of what quantum is all about, I believe. I had indicated the basic idea at Hermitian form – As equivariant modules, but now to flesh this out further.
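If I understand the intended statement, the self-duality condition is, schematically, the following (my paraphrase, to be taken with a grain of salt until the entry is fleshed out): a Hermitian form on a complex vector space $V$ is the same as a complex-linear map $V \to \overline{V}^\ast$ to the dual of the complex-conjugate space which equals its own conjugate-transpose; since complex conjugation is exactly the $\mathbb{Z}/2$-action in Atiyah’s “Real” setting, this exhibits the Hermitian structure as a self-duality datum on $V$ regarded as a $\mathbb{Z}/2$-equivariant module over $\mathbb{C}$.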
added pointer to:
Jonathan F. Schonfeld: The First Droplet in a Cloud Chamber Track, Found Phys 51 47 (2021) [doi:10.1007/s10701-021-00452-x]
Jonathan F. Schonfeld: Measured distribution of cloud chamber tracks from radioactive decay: A new empirical approach to investigating the quantum measurement problem, Open Physics 20 1 (2022) [doi:10.1515/phys-2022-0009]