Quantum theory and classical probability theory are special cases of generalized probabilistic theories (GPTs), i.e. **conceivable statistical theories that describe the probabilities and correlations of physical events**. They are studied to improve our understanding of quantum theory -- for example, by reconstructing it -- and to explore alternative models of physics or computation. Our group has contributed many fundamental insights into the structure of GPTs and their relation to physics.

### What is a GPT?

GPTs are rigorous mathematical theories that generalize both classical and quantum physics (in particular, their statistical predictions). The GPT framework is based on absolutely minimal assumptions — essentially, it contains only structural elements that represent self-evident features of general laboratory situations. It admits a large class of theories, with quantum theory (QT) as just one possible theory among many others.

The paradigmatic laboratory situation considered in the GPT framework (though by no means the only one) is sketched in the figure below: the *preparation* of a physical system is followed by a *transformation* and, finally, by a *measurement* that yields one of several possible outcomes with some well-defined probability.

The results of the preparation procedure are described by **states**, and the set of all possible states in which a given system can be prepared is its **state space**. *Every* possible state space defines a GPT system, up to a single constraint: we would like to be able to "toss a coin" and prepare one of two given states at random, with some probability. This introduces a notion of *affine-linear combinations* on the state space, which (together with a notion of normalization) implies that state spaces are **convex subsets** of some vector space over the real numbers.

**Transformations** map states to states, and they must be consistent with the preparation of statistical mixtures, i.e. they must be linear maps. Outcome probabilities are described by **linear functionals ("effects")** on the space of states. And this is essentially all that is assumed.
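The two structural ingredients above — convex mixtures of states and effects as linear functionals — can be illustrated with a minimal sketch in plain Python. The example below uses a classical two-outcome system (a "coin") purely for concreteness; the names `mix` and `effect_heads` are illustrative, not part of any GPT library.

```python
from fractions import Fraction

# A state of a classical two-outcome system ("coin") is a probability vector.
omega1 = [Fraction(1), Fraction(0)]   # deterministic "heads"
omega2 = [Fraction(0), Fraction(1)]   # deterministic "tails"

def mix(p, a, b):
    """Prepare state a with probability p, state b with probability 1 - p
    (the affine-linear / convex combination required by coin tossing)."""
    return [p * x + (1 - p) * y for x, y in zip(a, b)]

# An effect is a linear functional on states: here, "observe heads".
def effect_heads(state):
    return state[0]

p = Fraction(1, 3)
mixed = mix(p, omega1, omega2)

# Linearity: the probability of the mixture is the mixture of the probabilities.
assert effect_heads(mixed) == p * effect_heads(omega1) + (1 - p) * effect_heads(omega2)
print(mixed)                 # [Fraction(1, 3), Fraction(2, 3)]
print(effect_heads(mixed))   # 1/3
```

The same consistency condition — effects must respect statistical mixtures — is exactly what forces transformations and effects to be linear in every GPT, not just in this classical toy example.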

Two special cases are of particular importance:

**Quantum theory (QT)**. Systems are characterized by an integer *n* (the maximal number of perfectly distinguishable states), and the states are the (*n*×*n*) *density matrices*. The transformations are the *completely positive, trace-preserving maps*, and the effects are given by positive semidefinite operators with eigenvalues between 0 and 1 (*POVM elements*).

Among the transformations, the *reversible transformations* (those that can be undone) are the *unitary* maps.

**Classical probability theory (CPT)**. For given *n*, the states are the *n*-outcome *probability vectors*. The transformations are the channels, i.e. *stochastic matrices*, and the effects are given by non-negative vectors. The reversible transformations are the *permutations* of the configurations.
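Both special cases can be made concrete in a few lines of plain Python (no libraries; all names below are illustrative). The quantum part evaluates the Born rule p = Tr(Eρ) for a qubit, and the classical part applies a stochastic (here: permutation) matrix to a probability vector:

```python
# Quantum bit: Born rule  p = Tr(E rho)  with plain nested lists.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))]
            for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

# Pure state |+> = (|0> + |1>)/sqrt(2), written as a density matrix.
rho_plus = [[0.5, 0.5],
            [0.5, 0.5]]
# Projective effect |0><0| (a valid POVM element).
E0 = [[1.0, 0.0],
      [0.0, 0.0]]
p_quantum = trace(matmul(E0, rho_plus))   # 0.5

# Classical bit: states are probability vectors, channels are stochastic matrices.
state = [0.25, 0.75]
bitflip = [[0.0, 1.0],
           [1.0, 0.0]]   # a permutation, hence a reversible transformation
flipped = [sum(bitflip[i][j] * state[j] for j in range(2)) for i in range(2)]
# The effect "outcome 0" is the non-negative vector (1, 0).
p_classical = flipped[0]                  # 0.75
```

Note how both theories instantiate the same abstract scheme: a state (density matrix or probability vector), a transformation (CPTP map or stochastic matrix), and an effect paired linearly with the state to produce a probability.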

In addition to QT and CPT, there is a **continuum of GPTs with different kinds of physical properties**. For example, a GPT called "boxworld" contains states that violate Bell inequalities by more than any quantum state. Other GPTs predict interference of "higher order" than QT, a prediction that can in principle be tested experimentally. Note that typical GPTs do not carry any kind of algebraic structure -- there is in general no notion of "multiplication of observables".
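The boxworld claim can be checked with a short computation. The Popescu-Rohrlich (PR) box — a state available in boxworld — reaches the algebraic maximum of the CHSH Bell expression, above both the classical bound (2) and Tsirelson's quantum bound (2√2). A sketch in plain Python:

```python
import math

def pr_box(a, b, x, y):
    """PR-box distribution: P(a, b | x, y) = 1/2 if a XOR b == x AND y, else 0."""
    return 0.5 if (a ^ b) == (x & y) else 0.0

def correlator(x, y):
    """E(x, y) = sum over a, b of (-1)^(a+b) * P(a, b | x, y)."""
    return sum((-1) ** (a + b) * pr_box(a, b, x, y) for a in (0, 1) for b in (0, 1))

# CHSH expression: E(0,0) + E(0,1) + E(1,0) - E(1,1).
chsh = correlator(0, 0) + correlator(0, 1) + correlator(1, 0) - correlator(1, 1)

print(chsh)              # 4.0 — the algebraic maximum
print(2 * math.sqrt(2))  # ~2.828 — Tsirelson's bound for quantum theory
```

The PR box is nevertheless no-signalling, which is why it defines a legitimate GPT state even though no quantum state can reproduce its correlations.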

The "landscape" of GPTs provides a simple and extremely general framework in which QT can be situated. GPTs have a number of properties **in common** with QT:

- They satisfy a no-cloning theorem (yes, classical probability theory does too!).
- Most GPTs have non-unique decompositions of mixed states into pure states.
- They have a notion of post-measurement states.
- Many of them allow for a notion of entanglement similar to QT.

But they also **differ** in many ways. Some examples have been given above; here are more:

- Most GPTs **do not** have a notion of "orthogonal projection".
- There is in general **no** correspondence between states and measurement results (i.e. states and effects are different sets of vectors).
- "Energy" is typically a **superoperator** rather than an observable.
- Typically, there are pure states that are not connected by reversible time evolution.

Given this extremely general framework of theories, one can then go ahead and **explore the logical architecture of QT (and our world)**: if a GPT has property A, does it also have to have property B? Which kind of GPTs fit into spacetime physics or thermodynamics as we know it? Could some GPTs even describe physics in some hitherto unknown regime?

A research program in its own right is to write down a set of simple principles that "picks out" QT from the set of all GPTs. This is the program of reconstructing QT, and our group has made substantial contributions to it.

For more details (and the many references that have been omitted above), feel free to have a look at my Les Houches lecture notes.

As a complementary resource, Martin Plávala has written an excellent set of lecture notes on GPTs that are available here.

### Example results

As some examples, our group has published the following results on GPTs:

**Bit symmetry implies self-duality.** Suppose that a GPT has the property that every pair of pure, perfectly distinguishable states can be reversibly mapped to any other such pair ("bit symmetry") -- a property that holds for quantum theory and classical probability theory. Then it must be strongly self-dual, i.e. it must satisfy a particular form of state-observable correspondence.

M. P. Müller and C. Ududec, *Structure of Reversible Computation Determines the Self-Duality of Quantum Theory*, Phys. Rev. Lett. **108**, 130401 (2012). DOI: 10.1103/PhysRevLett.108.130401.

**Decoupling and the Page curve beyond quantum theory.** We have generalized the Hayden-Preskill calculation for "black holes as mirrors" to more general GPTs. As a result, the Page curve becomes modified at large evaporation times.

M. P. Müller, J. Oppenheim, and O. C. O. Dahlsten, *The black hole information problem beyond quantum theory*, J. High Energy Phys. **09**, 116 (2012). DOI: 10.1007/s13130-012-4801-4; arXiv:1206.5030.

**Thermodynamics in general theories.** We have analyzed how thermodynamics constrains the probabilistic theory that describes Nature:

M. Krumm, H. Barnum, J. Barrett, and M. P. Müller, *Thermodynamics and the structure of quantum theory*, New J. Phys. **19**, 043025 (2017). DOI: 10.1088/1367-2630/aa68ef; arXiv:1608.04461.

**Beyond-quantum Darwinism.** Quantum Darwinism aims to explain the emergence of a classical world from quantum theory, and we have analyzed under what conditions this explanation is available in arbitrary GPTs.

R. D. Baldijão, M. Krumm, A. J. P. Garner, and M. P. Müller,*Quantum Darwinism and the spreading of classical information in non-classical theories*, Quantum**6**, 636 (2022). DOI:10.22331/q-2022-01-31-636; arXiv:2012.06559.

For more examples, please see my publications page.

## Müller Group

#### Markus Müller

**Group Leader**
+43 (1) 51581 - 9530

#### Manuel Mekonnen

**PhD Student**