Information has come to play an increasingly fundamental role in our lives during the last few decades: billions of computers are now interconnected across the world, and our technology – and hence our survival and well-being – crucially relies on them.
It is much harder to argue that information plays a role in fundamental physics.
Traditionally, fundamental physics expresses predictions about where, say, a particle will go, given its initial state and its laws of motion. This paradigm has prevailed since Galileo and Newton and has been extremely successful, allowing us to formulate ever deeper explanations of the physical world, of which quantum theory and general relativity are the current best examples. Yet there are things in the physical world that this mode of explanation cannot adequately capture. Information is one of them.
For a start, the very idea that information is an element of the physical world is hard to grasp and counterintuitive. This is because information does not look like the usual objects that physics uses to explain reality. Information is not an observable, such as the velocity of a particle; nor does it depend on all the details of the physical system that happens to instantiate it: one bit of information can be instantiated equally well by systems with very different properties (e.g., the transistors of a computer, the flags of an air-traffic controller, and the neurons in the brain). Indeed, one key property of information is that it can be copied from one such system to any other, irrespective of their details, while retaining all of its properties qua information. This is a counterfactual property – about some transformation (i.e., copying) being possible.
All these facts about information suggest that it is an abstraction. Indeed, it was long thought to be a priori, just like, say, the natural numbers are.
In contrast, we also know that information can only exist if it is physically instantiated, and that it can be processed by physical systems – such as computers, ribosomes and brains. We can quantify how much information a particular physical system carries, using the classical Shannon measure. Also, with the advent of quantum computation, we have come to know that different laws of physics make different kinds of information processing possible: a quantum computer can access modes of computation so efficient that no physically buildable classical computer could match them, and modes of communication more secure than those available to any classical system.
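As a concrete illustration of the Shannon measure mentioned above, here is a minimal sketch in Python (the function name and example distributions are my own, chosen for illustration): the entropy of a distribution over a system's states quantifies, in bits, how much information observing that system yields.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits, of a
    probability distribution over a physical system's states."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A system with two equally likely states carries exactly one bit:
print(shannon_entropy([0.5, 0.5]))  # → 1.0

# A heavily biased system carries less information per observation,
# and a system with only one possible state carries none:
print(shannon_entropy([0.9, 0.1]))
print(shannon_entropy([1.0]))       # → 0.0
```

Note that the measure depends only on the probabilities of the states, not on what physically instantiates them – echoing the substrate-independence of information described above.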
So we have many clues that information must be an attribute of the physical world. But how can one express that fact within fundamental physics, if all one finds there is room for predictions about the trajectories of particles through space-time?
Constructor theory offers an elegant, powerful new way of doing so. It is a new mode of explanation in which the whole of physics can be reformulated in radically different terms. Instead of expressing everything in terms of predictions about where a particle will go – as one would do in the prevailing conception of physics – one expresses everything in terms of which physical transformations (or tasks) are possible, which are impossible, and why – given the other laws of physics.