Program

Conference layout

  1. 14:00 – Introduction
  2. 14:20 – Valia Allori: Who should and should not care about the Schrodinger cat problem
  3. 15:00 – Angelo Bassi
  4. 15:40 – Coffee break
  5. 16:10 – Dirk A. Deckert: Theories on nature, the quantum regime and the empirical import
  6. 16:50 – Matteo Carlesso: Collapse models as a solution to the measurement problem
  7. 17:30 – Dustin Lazarovici: What’s wrong with many-worlds?
  8. 18:10 – Discussion and wrap-up

Valia Allori

Keywords: Scientific Realism, measurement problem, Ontology

I took only a few notes, as this was more of a philosophical discussion than the usual note-taking kind of lecture.

Angelo Bassi

Postulates summary:

  1. States: the state of a system is a (normalised) vector $|\psi\rangle$ in a Hilbert Space.
  2. Dynamics: the state evolves according to the Schrodinger Equation.
  3. Quantum observables: each observable corresponds to a Self-Adjoint Operator, whose eigenvalues are the possible outcomes of a Measurement.
  4. Measurement: measuring an observable collapses the wave function onto an eigenstate $|a_i\rangle$ with probability $|\langle a_i | \psi \rangle|^2$ (Born's rule). It is unclear what is / isn’t a Measurement.

The Quantum observable postulate can be derived from the other postulates, together with a few assumptions about Measurements.

We start our setup by stating that:

  1. The outcomes $a_i$ map to corresponding states $|a_i\rangle$.
  2. Measurements modify the state of a system

Measurements select states, i.e.

  • They cause the state of the system to change.
  • This is confirmed by consecutive Measurements returning the same value.
  • The states $|a_i\rangle$ are orthogonal. We assume this, as it is independent of the other 3 postulates.

We are able to reconstruct a Measurement as an Operator $A = \sum_i a_i \, |a_i\rangle\langle a_i|$. This Operator is Self-Adjoint by the nature of its construction. Hence we can derive postulate 3 above from the remaining postulates and the assumptions listed here.
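
A minimal sketch of the construction, assuming (as the bullets above give us) real outcomes $a_i$ and orthonormal selected states $|a_i\rangle$:

$$
A \;=\; \sum_i a_i \, |a_i\rangle\langle a_i| ,
\qquad
A^{\dagger} \;=\; \sum_i a_i^{*} \, |a_i\rangle\langle a_i| \;=\; A \quad (a_i \in \mathbb{R}),
\qquad
A\,|a_i\rangle \;=\; a_i\,|a_i\rangle .
$$

So the Operator assembled from the Measurement data is Self-Adjoint and has the possible outcomes as its eigenvalues, which is exactly the content of postulate 3.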

Dirk A. Deckert

  • Ontology: Objects > Relations > Change. This is a dynamical structure.
  • Desired qualities
    • Predictive power
    • Empirical adequacy

In the Quantum regime, we describe time evolution through Unitary Operators that map Quantum states within a Hilbert Space $\mathcal{H}$.

Availability corresponds to the decomposition of the Tensor product for time evolution between the objects we care about and the rest of the world. More specifically, $\mathcal{H} = \mathcal{H}_{o} \otimes \mathcal{H}_{r}$, where $\mathcal{H}_{o}$ carries the object relations we are interested in, and $\mathcal{H}_{r}$ carries the relations happening in the rest of the world. Then the time evolution should factorise as $U(t) = U_{o}(t) \otimes U_{r}(t)$.
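
A small worked illustration (standard quantum mechanics rather than anything specific to the talk) of how this availability can fail: if the global state is entangled across the split,

$$
|\Psi\rangle \;=\; \sum_i c_i \, |\psi_i\rangle_{o} \otimes |\phi_i\rangle_{r}
$$

with more than one non-zero $c_i$, then no product decomposition $|\Psi\rangle = |\psi\rangle_{o} \otimes |\phi\rangle_{r}$ exists, and the object factor alone no longer carries a self-contained description.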

We then can have:

  1. The evolution of $\psi$ is non-linear: the Copenhagen interpretation (collapse upon Measurement).
  2. $\psi$ does not capture all the information needed: Bohmian mechanics.

The game is to have Born’s rule in the form of a theorem.

Bohmian mechanics

We explain the world as a pair $(\psi, Q)$: a Wave function $\psi$ guiding actual particle positions $Q = (Q_1, \dots, Q_N)$.
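
For completeness, the textbook form of the dynamics (standard reference material, not something I copied from the talk): $\psi$ obeys the Schrodinger Equation, while the positions follow the guiding equation

$$
\frac{\mathrm{d}Q_k}{\mathrm{d}t} \;=\; \frac{\hbar}{m_k}\,\operatorname{Im}\!\left[\frac{\nabla_k \psi}{\psi}\right](Q_1,\dots,Q_N).
$$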

  • I asked Dirk A. Deckert for the lecture notes here, as they are very clear and comprehensive. Sorry for not taking further notes. I should organise the slides shared into a coherent set of notes.

Matteo Carlesso

Linear term

As a first attempt, we introduce a deterministic, linear collapse term alongside the Hamiltonian. The resulting evolution is not unitary, so it does not work.
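
As an illustration of what such a term could look like (my reconstruction in the spirit of the standard collapse-model reviews, not necessarily the form shown in the talk), add an anti-Hermitian piece to the generator:

$$
i\hbar\,\frac{\mathrm{d}}{\mathrm{d}t}\,|\psi_t\rangle \;=\; \big(H - i\lambda A\big)\,|\psi_t\rangle ,
\qquad A = A^{\dagger},\ \lambda > 0 .
$$

Since $H - i\lambda A$ is not Self-Adjoint, the generated evolution is not unitary and the norm of $|\psi_t\rangle$ is not preserved.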

Stochastic Linear term

We can also add a Stochastic Linear term, driven by a noise process. If we neglect the Hamiltonian, the resulting time evolution preserves the populations, so it does not give us a collapse. Hence this does not work either.
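
Again as a hedged reconstruction (the standard linear stochastic Schrodinger equation from the collapse-model literature, not necessarily the talk's notation), with $W_t$ a Wiener process and $A$ the collapse operator:

$$
\mathrm{d}|\psi_t\rangle \;=\; \left[ -\frac{i}{\hbar} H \,\mathrm{d}t \;+\; \sqrt{\lambda}\, A \,\mathrm{d}W_t \;-\; \frac{\lambda}{2} A^{2} \,\mathrm{d}t \right] |\psi_t\rangle .
$$

Averaging over the noise (and neglecting $H$) gives a pure dephasing master equation in the eigenbasis of $A$; its diagonal entries, the populations, stay constant, which is the failure described above.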

Deterministic non-linear dynamics

Non-linear terms are therefore necessary, since the above options do not suffice to explain collapse; however, deterministic non-linear dynamics for $\psi$ allows for faster-than-light signalling.

Hence we need stochastic non-linear modifications.

Stochastic non-linear dynamics

See GRW Theory, a spontaneous localisation model; its continuous-time generalisation is the Continuous Spontaneous Localization (CSL) model.

We build localisations that map the Wave function $\psi$ to a localised version (the Wave function multiplied by a Gaussian of fixed width centred at a random point, then renormalised). In between localisations, the Wave function evolves according to the Schrodinger Equation.

The theory is characterised by a collapse rate $\lambda$: localisations occur randomly in time as a Poisson process with rate $\lambda$, so the time between consecutive localisations is exponentially distributed.
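
A toy numerical sketch of the two ingredients above, under assumptions of my own (a single particle on a 1D grid, made-up units for the rate lam and width r_c, the free Schrodinger evolution between hits omitted, and the hit centre sampled from $|\psi|^2$ rather than the Gaussian-smeared GRW density):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.5    # collapse rate (made-up units, not the physical ~1e-16 s^-1)
r_c = 1.0    # localisation width (made-up units)

# Superposition of two well-separated wave packets on a 1D grid
x = np.linspace(-10.0, 10.0, 401)
psi = np.exp(-((x + 4.0) ** 2) / 8.0) + np.exp(-((x - 4.0) ** 2) / 8.0)
psi /= np.linalg.norm(psi)

t, t_max = 0.0, 50.0
while True:
    # Waiting times between localisations are exponential with rate lam (Poisson process)
    t += rng.exponential(1.0 / lam)
    if t > t_max:
        break
    # Sample the hit centre from |psi|^2 (simplified; GRW uses the Gaussian-smeared density)
    prob = np.abs(psi) ** 2
    centre = rng.choice(x, p=prob / prob.sum())
    # Multiply by a Gaussian of width r_c centred at the hit and renormalise
    psi *= np.exp(-((x - centre) ** 2) / (4.0 * r_c ** 2))
    psi /= np.linalg.norm(psi)

# After a few hits the superposition has collapsed onto one of the two packets
mean = np.sum(np.abs(psi) ** 2 * x)
spread = np.sqrt(np.sum(np.abs(psi) ** 2 * (x - mean) ** 2))
print(f"final mean position {mean:.2f}, spread {spread:.2f}")
```

With these deliberately exaggerated parameters the state localises onto a single packet after a handful of hits, which is the qualitative behaviour the model is built to produce.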

Experimental results

This section describes experiments we can use to test the validity of the GRW Theory. We can look at the destruction of quantum superposition in interferometric experiments and compare the observed behaviour with the collapse-model predictions.

A key aspect of the experiments is to measure the entanglement times experimentally, and to compare those with the theoretical predictions.

  • I want to look into the data and derivations used for this. It feels like a relatively standard application of the Maximum Likelihood Estimation framework, however using an underlying model which is not clear to me: the Poisson process for the collapse times is clear, but a measurement requires choosing a Measurement time $t$ a priori, which means that our observations are not the time of the collapse, but instead a check at a set of times $t_i$ chosen previously. It would be cool to optimise the choice of $t_i$ via Fisher Information, to maximise the information we get about the underlying parameter $\lambda$. A sketch of this idea follows below.
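
A quick sketch of that last idea, under a toy model of my own rather than the actual experimental analysis: suppose each run only records whether a collapse has happened by the pre-chosen check time $t$, i.e. a Bernoulli observation with success probability $p(t) = 1 - e^{-\lambda t}$. Then the Fisher Information about $\lambda$ per run can be maximised over $t$:

```python
import numpy as np

def fisher_information(t, lam):
    """Fisher information about lam from one Bernoulli check at time t,
    with p(t) = 1 - exp(-lam * t)."""
    p = 1.0 - np.exp(-lam * t)
    dp_dlam = t * np.exp(-lam * t)
    return dp_dlam ** 2 / (p * (1.0 - p))

lam_guess = 1.0                         # hypothetical working value of the rate
ts = np.linspace(0.01, 10.0, 2000)      # candidate check times, in units of 1/lam_guess
t_star = ts[np.argmax(fisher_information(ts, lam_guess))]
print(f"most informative check time: t = {t_star:.2f} (in units of 1/lam_guess)")
```

The optimum depends on the assumed $\lambda$, so in practice one would spread the $t_i$ over a range or iterate; that design question is exactly what Fisher Information is meant to answer.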

Dustin Lazarovici

The measurement problem states that one of the following is false:

  1. The Wave function is a complete description of reality
  2. The Wave function evolves according to the Schrodinger Equation
  3. Measurements have definite outcomes.

I took only a few notes, as this was more of a philosophical discussion than the usual note-taking kind of lecture.