Can urbanism be explained through entropy? The urban sciences try to understand the processes, forces and phenomena at work in the formation, growth and decline of cities, and in how they function. They are sciences whose progress feeds on advances in other fields: biology, politics, economics, sociology and mathematics are just some of the disciplines that contribute to our understanding of the city.
But what about physics? Intuitively, we are inclined to think that nothing falls outside the laws of physics, and urbanism should not be an exception. As we explained in the first article of this series, some scientists think that the fundamental law of the Universe is the Second Law of Thermodynamics, the one that says that the entropy of the Universe always increases. As we will see below, we are not the first to try to find a relationship between entropy and urbanism.
The entropic metaphor of urbanism
This article on “Entropic Urbanism” by Ignacio Grávalos and Patrizia Di Monte interestingly cites Haussmannian urbanism as a metaphor for the entropic city and, by extension, the modern movement, whose most uniform product is suburbia. According to both architects, the entropic city is the one that tends towards uniformity, where nothing happens and there is no room for surprise. How can one not recall that wonderful song by David Byrne, “Heaven”?
“(…) Heaven is a place, where nothing ever happens. The band in Heaven, they play my favorite song. They play it once again, they play it all night long”
Byrne doesn’t say so, but one senses that the favorite song, listened to on a loop, can become abhorrent. From the entropic, perfect city, from the suburbs where every day is happily the same, to dystopia, there is only one step. Remember “The Truman Show.”
On the contrary, it is the small daily revolutions that let the particles (the inhabitants of the city) defy their entropic destiny. The two urban planners behind “Esto no es un solar” know this well: their program has turned many vacant lots, where nothing ever happened, into places where things happen.
Beyond the metaphors, the truth is that there is no divide between entropic cities and non-entropic ones. Every city is entropic, if by that we mean subject to the dictates of entropy, because everything that exists in the universe is: the Second Law of Thermodynamics is incontestable, and nothing and no one can escape it.
The probabilistic approach of the geographer Michael Batty
The entropy of a system is directly related to the information it carries. When entropy, or disorder, is at a maximum, all the states of the system are equally probable. This is best understood with the example of a message. If we generate a message at random, we will not be able to extract any information from reading it. If instead we write it according to a certain order or rules (for example, the convention that the word “pen” refers to a pen and “broken” means that something is broken), then reading it yields abundant information: in this case, that the pen is broken.
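The contrast between a random message and a rule-governed one can be made concrete with Shannon’s formula, H = −Σ p·log₂(p), measured in bits per character. A minimal sketch in Python (the example messages and the lowercase alphabet are illustrative assumptions, not from the article):

```python
import math
import random
import string
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy of a message, in bits per character."""
    counts = Counter(message)
    n = len(message)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

random.seed(0)
# A message generated at random: every letter equally likely
random_msg = "".join(random.choice(string.ascii_lowercase) for _ in range(1000))
# A message written according to rules: few symbols, highly predictable
ordered_msg = "the pen is broken " * 55

print(shannon_entropy(random_msg))   # close to log2(26) ≈ 4.70 bits/char
print(shannon_entropy(ordered_msg))  # much lower: order means fewer surprises
```

The random message sits near the theoretical maximum for a 26-letter alphabet, while the rule-governed one carries far fewer bits per character: precisely because it is ordered, each character is more predictable.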
Let’s talk about a physical system, such as the particles of a gas in a room. In such a system, maximum entropy is reached when the temperature of the room has equaled that of the outside and the flow of energy and heat between the two is zero. In that case, the gas will be evenly distributed throughout the room, and there will be no drafts, because there is no temperature difference to heat the air and make it rise. Under these conditions, a given gas particle is just as likely to be in a corner as in the center of the room, and therefore all states (a state is a possible distribution of particles) are equally likely. Since all states are equally probable, knowing the position of every gas particle (our message from the previous example) provides no information.
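The claim that “equally probable” means “maximum entropy” can be checked numerically. In the sketch below, the room is divided into four regions and we compare a particle that is equally likely to be anywhere against one concentrated in a corner; both distributions are invented for illustration:

```python
import math

def entropy_bits(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

uniform = [0.25, 0.25, 0.25, 0.25]   # particle equally likely in any region
peaked = [0.85, 0.05, 0.05, 0.05]    # particle concentrated in one corner

print(entropy_bits(uniform))  # 2.0 bits: the maximum for four regions
print(entropy_bits(peaked))   # well below 2.0: the imbalance is informative
```

No rearrangement of the probabilities over four regions can exceed the 2 bits of the uniform case, which is exactly why the equilibrium gas, where everything is equally likely, tells us nothing.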
In his analysis of the relationship between urbanism and entropy, Michael Batty applies this probabilistic approach to the distribution of people in the city. The probability that there is someone at a point i located at a distance Di from the center is, according to the British geographer, Pi = K · exp(−λ · Di). As can be seen, at the center of the city Di = 0, so Pi = K; at great distances, Pi tends to 0.
Batty’s entropy is a spatial entropy: it measures the degree of heterogeneity of the population distribution in urban space. Batty argues that the entropy of a city (or its information) gives an idea of its complexity. When a city grows and develops new networks, new centers and new points of interest (political, cultural, economic, etc.), it becomes increasingly complex and unpredictable, and its entropy decreases.
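A common way to turn spatial heterogeneity into a number is to apply Shannon’s entropy to the shares of population living in each zone of the city: concentrated, monocentric patterns score low, and evenly spread patterns score high. The sketch below uses invented zone populations and is a generic spatial-entropy calculation, not Batty’s exact formulation:

```python
import math

def spatial_entropy(populations):
    """Shannon entropy, in bits, of the population shares across zones."""
    total = sum(populations)
    shares = [x / total for x in populations]
    return -sum(s * math.log2(s) for s in shares if s > 0)

monocentric = [900, 40, 30, 20, 10]   # almost everyone lives in one zone
even = [200, 200, 200, 200, 200]      # population spread uniformly

print(spatial_entropy(monocentric))  # low: a concentrated, heterogeneous pattern
print(spatial_entropy(even))         # log2(5) ≈ 2.32 bits: maximal spread
```

The uniform distribution over five zones hits the maximum of log₂(5) bits, echoing the earlier point that uniformity is the high-entropy state.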
Urban planning as a process of transforming energy flows into structures
How does a city become “ordered” (decrease its entropy) and increase its complexity? In the same way that a system of dunes, a plant, or any living being does: thanks to an input of energy and matter. Let us remember that a city is not an isolated system and can therefore circumvent the Second Law of Thermodynamics momentarily, in exchange for a cyclical input of energy, as a refrigerator does (in both cases, the decrease in entropy within each system separately leads to a global increase in the entropy of the Universe, as the Second Law of Thermodynamics rigidly establishes).
These flows of energy or matter (after all, matter is just one more form in which energy presents itself) change the configuration of the city, and the set of processes that channel that energy into the formation of urban structure is what we call urbanism.
Obviously, we do not need urban planners for this reconfiguration of urban matter under inputs of energy to occur. Unplanned settlements (refugee camps, shantytowns, favelas) are examples of this. In these cases, without the intervention of urban planning, the city organizes itself from the bottom up. Not surprisingly, when we let the city grow without a planner, the structures that emerge are often fractal in nature (that is, similar at various scales). Just like in nature.
Fractal configurations arise, among other reasons, because they optimize the capture of nutrients and energy and their transport through a given structure. Christopher Alexander warned, however, that the fact that a tree is a clear example of a fractal structure should not lead us to tree-based urban planning. Since at least the 1960s we have known that the city is best explained as an ecosystem, not as an organism.
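The “similar at various scales” property can even be measured: the box-counting dimension asks how many grid boxes of shrinking size a structure occupies. The sketch below applies it to a Sierpinski triangle generated with the chaos game, a standard stand-in for fractal growth patterns; the point count and box sizes are illustrative choices:

```python
import math
import random

random.seed(1)

# Generate points on the Sierpinski triangle via the chaos game:
# repeatedly jump halfway toward a randomly chosen vertex.
verts = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)]
x, y = 0.25, 0.25
pts = []
for _ in range(50000):
    vx, vy = random.choice(verts)
    x, y = (x + vx) / 2, (y + vy) / 2
    pts.append((x, y))

def box_count(points, eps):
    """Number of eps-sized grid boxes that contain at least one point."""
    return len({(int(px / eps), int(py / eps)) for px, py in points})

# Dimension estimate: slope of log N(eps) against log(1/eps)
e1, e2 = 1 / 16, 1 / 64
dim = math.log(box_count(pts, e2) / box_count(pts, e1)) / math.log(e1 / e2)
print(round(dim, 2))  # close to the theoretical log(3)/log(2) ≈ 1.585
```

A line, by contrast, would give a dimension near 1 and a filled square near 2; the non-integer value in between is the signature of a fractal, the kind of geometry that tends to emerge in unplanned urban growth.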
Time, entropy and complexity
The Second Law of Thermodynamics is so inexorable that many scientists (such as the great Eddington) argue that it should be the first. In fact, it is this law that marks the arrow of time: time is measured in the direction in which entropy advances.
Therefore, time and entropy are directly related. But how does the complexity of systems within our Universe change as both advance? Systems, be it a plant, a house, or the entire Universe, start with low complexity: a seed, a brick, or the Big Bang. As time passes, systems acquire complexity until, at some point, they reach their zenith. From there, the plant dies, the house is abandoned to its collapse, and the Universe will end up as a cloud of scattered particles.
The same thing happens to cities: they start small and grow to reach their zenith as urban planning converts their energy inputs (raw materials, fuel, talent, information, etc.) into an ordered structure (houses, streets, universities, factories, etc.). But sooner or later decline comes, a phase in which their complexity decreases as their disorder increases. This is, for example, the case of Detroit and other cities in the US Rust Belt.
In the next article we will talk about what we, designers and city dwellers, can do in all this framework. For now, we leave the matter of urbanism and entropy here.
Article published under Creative Commons free culture license. Some rights reserved.
Photo by Casey Connell via Unsplash