Information as constraint of behavior
Institutions act by constraining the behavior of individuals. The constraint can be discrete, such as disallowing a behavior outright, or continuous, making certain behaviors more expensive or otherwise harder to reach. While the former is more obvious to us, the latter is perhaps the more important mechanism for keeping our society going.
But how does an institution use this to get work done? Constraint of behavior is a reduction of the probabilities of certain behaviors in certain circumstances. For a large, complicated system like a human, it’s difficult to map out the entire space of behaviors, especially since most behaviors are combinations of others. We would need a way to encode each behavior as a number and ways to translate among those numbers, and there would be very many of them.
However, if we imagine a simpler organism, we can see how constraining its behavior constitutes information. Let’s consider an organism with one degree of freedom each for sensation and action. It has a sensor which accepts an ion from the environment and, when triggered, causes the organism to open its membrane. For a given concentration of these ions, there is a certain probability that the sensor is triggered. When the organism opens its membrane, its internal concentration becomes that of the surroundings. Let’s call it a “sentibrane”, for “sense + membrane”, just for ease of reference. So we have two sensory states, present/absent, and two behavioral states, opened/closed.
Now let’s imagine that in a given moment, the following happens:
1. a new concentration is established
2. the brane registers an ion on its sensor, or it does not
3. if it registers an ion, it opens and syncs its concentration with the environment’s
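A minimal simulation of this loop, sketched in Python, makes the dynamics concrete. The trigger-probability curve (linear, saturating at concentration 10) and the particular concentrations are illustrative assumptions, not taken from the scenario above.

```python
import random

def trigger_probability(concentration, saturation=10.0):
    """Illustrative assumption: the chance the sensor registers an ion rises
    linearly with concentration and reaches 1 at the saturating concentration."""
    return min(concentration / saturation, 1.0)

def step(inside, outside):
    """One moment of the loop: the brane either registers an ion and opens,
    syncing its interior with the surroundings, or it stays closed."""
    if random.random() < trigger_probability(outside):
        return outside, "open"
    return inside, "closed"

# Drive the sentibrane with a fluctuating outside concentration.
inside = 0.0
for outside in [2.0, 8.0, 3.0, 1.0, 9.0, 2.0]:
    inside, state = step(inside, outside)
    print(f"outside={outside:>4} state={state:<6} inside={inside}")
```

Over many steps the interior tends to sit at or above the surrounding concentration, because the membrane mostly opens when the outside is high and stays closed when it is low.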
We can see that we have created an area which will maintain a higher ion concentration than its surroundings: the membrane is more likely to open when the outside concentration is high, taking on that concentration, and to stay closed when it is low, trapping the higher concentration inside. This is negentropy, or information. The sentibrane created information about its environment via a signal through a noisy channel. It can now probabilistically rely on the fact that its internal concentration is higher than the external concentration of these ions. It can use the difference in potential, or osmotic pressure, to store potential energy, and then release it when its sensor is triggered. In order to do this, it must let the ions flow from the inside to the outside of the membrane, reducing the potential energy stored in this way.
This is how information can be created and used to extract energy. Information is, in a sense, potential energy, because any potential difference can in principle be used to do work. Information can only exist for an entity which can do work with that potential difference. We can say that the sentibrane has information about the environment because it relies on the difference in probability to do work, and behaves based on this. In doing such work, it *consumes* the information. The potential difference is reduced as it is used to do work on whatever mechanism the sentibrane uses to trigger its actuator, as well as to prepare the sensor to be tripped by another ion.
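As a rough quantitative gloss (a standard result for ideal dilute solutions, not something the sentibrane story depends on), the maximum work extractable per ion that flows down the gradient is k_B·T·ln(c_inside/c_outside):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def max_work_per_ion(c_inside, c_outside, temperature=300.0):
    """Upper bound on the work (joules) gained by letting one ion flow from
    the higher inside concentration to the lower outside concentration."""
    return K_B * temperature * math.log(c_inside / c_outside)

# Example: interior at concentration 10, surroundings at 3 (arbitrary units).
print(max_work_per_ion(10, 3))  # roughly 5e-21 J per ion
```

As ions flow out and the two concentrations approach each other, this extractable work shrinks toward zero, which is the sense in which doing the work consumes the information.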
We can say that information flows from the environment to the sentibrane in this way. There is a channel capacity which we can compute. We can also calculate the amount of work that the sentibrane can do as a result of that information. And we should be able to compare the entropy of the environment’s concentration distribution with the entropy of the distribution inside the membrane to understand the information gained.
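Here is a sketch of what that calculation could look like, assuming for illustration a two-state environment (ion present or absent with some prior) and a trigger probability for the sensor in each state; none of these numbers come from the scenario itself.

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(p_present, p_open_given_present, p_open_given_absent):
    """I(environment; membrane state) for a binary environment and a binary
    (open/closed) response -- the information the sentibrane gains per moment."""
    p_absent = 1 - p_present
    p_open = p_present * p_open_given_present + p_absent * p_open_given_absent
    h_open = entropy([p_open, 1 - p_open])  # H(behavior)
    h_open_given_env = (
        p_present * entropy([p_open_given_present, 1 - p_open_given_present])
        + p_absent * entropy([p_open_given_absent, 1 - p_open_given_absent])
    )
    return h_open - h_open_given_env  # I = H(Y) - H(Y|X)

# Illustrative numbers: ions present half the time, a leaky but useful sensor.
print(mutual_information(0.5, 0.9, 0.1))  # about 0.53 bits per moment
```

The channel capacity is the maximum of this quantity over the prior on the environment’s state, and the entropy comparison between the outside and inside distributions gives the same picture in terms of negentropy.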
The umwelt, or perceptible world, of this sentibrane is limited to a single sensor with a single bit of capacity, plus the ability to open or close its membrane. While it doesn’t appear that the sentibrane is altering its environment, it actually is: the area within the membrane is part of its environment. When a crab digs a hole, we see it as altering its external environment in a way we wouldn’t attribute to the sentibrane, but it is not ultimately different. The crab digging a hole is performing a transformation on the probability distribution of its environment, just like the sentibrane is.
When the sentibrane opens or closes its membrane, it sends a signal back to the environment, which could in principle be used to create information. But in our example, the environment is not doing anything with that signal. There is no significant difference in potential energy produced by the actions of the sentibrane, so no information is created from it, despite the signal being sent. However, as the constructors of this hypothetical scenario, *we* could arrange the environment to convert that signal into information, and we can also constrain the sentibrane’s behavior. If we create a solution of concentration 10, we can guarantee the sentibrane will open. We could create some mechanism which converts the mechanical energy of its opening into potential energy. By changing concentrations back and forth, we can extract energy from the sentibranes. To do so, we must position them such that their actuation properly drives whatever mechanism we’ve devised. This positioning is a critical part of the creation and maintenance of that information.
We can predict the probability of the sentibrane being in the open state based on the concentration we put it in. If we need to constrain that probability to 1, we simply make sure it’s in a concentration of 10, though it takes more energy for us to create a solution of that concentration. We could also shift between concentrations of 3 and 7 and extract work less reliably.
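A small sketch of that trade-off: the trigger-probability curve and the cost of preparing a solution are both invented for illustration; the only fixed point from the scenario is that concentration 10 guarantees opening.

```python
def p_open(concentration, saturation=10.0):
    """Illustrative assumption: opening becomes certain at concentration 10."""
    return min(concentration / saturation, 1.0)

def expected_net_work(concentration, work_per_actuation=1.0,
                      cost_per_unit_concentration=0.05):
    """Expected work harvested from one actuation cycle, minus the (assumed
    linear) cost of preparing a solution at the given concentration."""
    return (p_open(concentration) * work_per_actuation
            - cost_per_unit_concentration * concentration)

for c in (3, 7, 10):
    print(f"concentration {c:>2}: p(open)={p_open(c):.1f}, "
          f"expected net work={expected_net_work(c):+.2f}")
```

Whether driving the sentibrane at a guaranteed concentration of 10 or cycling between the cheaper but less reliable 3 and 7 comes out ahead depends entirely on these assumed costs and payoffs.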
As we can see, one of the basic processes of life is, on some level, equivalent to transforming a probability distribution and then using that transformed distribution to power another behavior, which in turn transforms a probability distribution of its own. One aspect glossed over in this example, but critical for life, is the closing of the loop: the sensor is the channel through which this information flows, and it requires energy to maintain, meaning it requires a constraint on a probability distribution.
Institutions are a part of this process. They receive information from their environment and in turn constrain our behavior. We are part of the mechanism by which they maintain their information channels and by which they actuate to do work on their environment, external and internal, including us. By constraining our behavior, they are able to create information, and thus to do work. Exactly how they constrain our behavior, and what work they do, is a result of how we design those institutions.