Information has come to play an increasingly fundamental role in our lives over the last few decades: billions of computers are now interconnected across the world, and our technology – and hence our survival and well-being – crucially relies on them.
It is much harder to argue that information plays a role in fundamental physics.
Traditionally, fundamental physics makes predictions about where, say, a particle will go, given its initial state and its laws of motion. This paradigm has prevailed since Galileo and Newton and has been extremely successful, allowing us to formulate deeper and deeper explanations of the physical world – quantum theory and general relativity being the current best examples. Yet there are things in the physical world that this mode of explanation cannot adequately capture. Information is one of them.
For a start, that information is an element of the physical world is itself rather hard to grasp and counterintuitive! This is because information does not look like the usual objects that physics uses to explain reality. Information is not an observable, such as the velocity of a particle; nor does it depend on all the details of the physical system that happens to instantiate it: one bit of information can be instantiated equally well by systems with very different properties (e.g., the transistors of a computer, the flags of an air-traffic controller and the neurons in the brain). Indeed, one key property of information is that it can be copied from one such system to any other, irrespective of their details, while retaining all of its properties qua information. This is a counterfactual property – a statement about some transformation (i.e., copying) being possible.
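The substrate-independence described above can be sketched in code – a toy illustration only (the representations and function names below are mine, not the essay's). The same bit is modelled in three very different "substrates", and copying it between them preserves its value regardless of the physical details each representation stands for:

```python
# Hypothetical representations of one bit in three different substrates:
transistor = {"voltage_high": True}   # a transistor's voltage level
flag = {"position": "raised"}         # an air-traffic controller's flag
neuron = {"firing": False}            # a neuron firing or not

def read_bit(system):
    """Abstract the bit away from the substrate's physical details."""
    if "voltage_high" in system:
        return system["voltage_high"]
    if "position" in system:
        return system["position"] == "raised"
    return system["firing"]

def copy_bit(source, target):
    """Copy the bit from one substrate to another: the counterfactual
    transformation (copying) that the essay says is always possible."""
    bit = read_bit(source)
    if "voltage_high" in target:
        target["voltage_high"] = bit
    elif "position" in target:
        target["position"] = "raised" if bit else "lowered"
    else:
        target["firing"] = bit

# Copy the transistor's bit into the neuron: different physics, same information.
copy_bit(transistor, neuron)
assert read_bit(transistor) == read_bit(neuron)
```

The point of the sketch is that `read_bit` and `copy_bit` never care which substrate they are handed: the bit's properties qua information survive the transfer intact.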