Relational Information: Towards a New Type of Information in Quantum Mechanics
The question of what information is becomes especially pressing in quantum mechanics, where information is routinely quantified by the von Neumann entropy, a quantity that serves both as the quantum analogue of Shannon entropy and as a measure of entanglement correlations.
Shannon's theory of communication (Shannon, 1948) provides the classical paradigm: information is defined in terms of the compressibility of messages drawn from probabilistic ensembles, and it presupposes a sender, a receiver, and a reproducible message. In this framework, entropy measures a structural property of the source: the resources needed to faithfully encode and transmit its outputs. While powerful, this account remains silent about the nature of information itself, a silence that Timpson (2004) takes as the starting point for his deflationary analysis of information theory.
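For concreteness, the quantity at issue is the Shannon entropy of the source; the notation below is ours, following the standard textbook form:

\[
H(X) = -\sum_{x} p(x)\,\log_2 p(x),
\]

where \(p(x)\) is the probability that the source emits the symbol \(x\). Shannon's noiseless coding theorem then says that \(n\) independent outputs of the source can be faithfully compressed into roughly \(nH(X)\) bits, and no fewer; this is the precise sense in which entropy measures the resources needed to encode and transmit the source's outputs.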
The von Neumann entropy (von Neumann and Beyer, 2018), however, reveals a deeper tension. For global mixed states, it behaves just like Shannon's entropy: it characterizes a quantum source and underpins Schumacher's coding theorem (Schumacher, 1995). Yet when applied to the reduced states of the subsystems of an entangled whole, the very same quantity plays a different role. It no longer measures ignorance about a source; instead it quantifies correlations that cannot be attributed to either subsystem alone (Popescu and Rohrlich, 1997). These correlations are invariant across observers and reflect objective relational constraints rather than epistemic uncertainty.
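A standard illustration makes the contrast concrete; the particular Bell state and the labels \(A\), \(B\) below are our choices for exposition, not drawn from the works cited. The von Neumann entropy of a density operator \(\rho\) is

\[
S(\rho) = -\mathrm{Tr}\,(\rho \log_2 \rho).
\]

In its coding role, Schumacher's theorem gives \(S(\rho)\) the same operational reading as \(H(X)\): roughly \(nS(\rho)\) qubits suffice to encode \(n\) signals from a quantum source with density operator \(\rho\). In its correlation role, consider the entangled pair \(|\Phi^{+}\rangle = (|00\rangle + |11\rangle)/\sqrt{2}\): the global state is pure, so \(S(\rho_{AB}) = 0\), while each reduced state is maximally mixed, \(\rho_{A} = \rho_{B} = I/2\), so that \(S(\rho_{A}) = S(\rho_{B}) = 1\) bit. The nonzero subsystem entropy records nothing about a source's output statistics; it registers how strongly the two subsystems are correlated.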
This dual role suggests that quantum information is not a single unified concept. Alongside epistemic information, grounded in communication, there exists a distinct form of information grounded in structural interdependence. We call this relational information. Unlike Shannon information, relational information expresses physically instantiated connections between the subsystems of entangled states.

Recognizing this distinction enables us to challenge Timpson's influential thesis that classical and quantum information exemplify the same informational type (Timpson, 2016). By revisiting his type/token framework, we argue that entanglement entropy constitutes a token of a new informational type, irreducible to the Shannon type. This distinction also clarifies why von Neumann entropy plays two otherwise incompatible roles: as a coding measure for sources and as a correlation measure for subsystems.

Our proposal is not merely terminological. It addresses a persistent ambiguity in the foundations of quantum theory and in broader appeals to "information" in physics (Landauer, 1996; Penrose, 1998; Vedral, 2010). The central claim is that relational information is ontological: it concerns not what agents know or transmit, but how physical systems are related. This invites a reconceptualization of information as part of the relational structure of the quantum world.
