In order to be responsive to a world consisting of different kinds of inputs, complex systems tune themselves to states holding just enough variety to remain responsive, and just enough homogeneity to remain organized and stable. To understand how this works, we need to understand flows of information in complex systems, and what "information" means.
Complex Systems are ones that would appear to violate the second law of thermodynamics: that is to say, order manifests out of disorder. Another way to state this is that, within the boundary of the system, order (Negentropy) increases over time. This appears counter to the second law of thermodynamics, which states that, left to its own devices, a system's disorder (entropy) will increase. Thus we expect that, over time, buildings break down, and a stream of cream poured into a cup of coffee will dissipate. We don't expect a building to rise from the dust, nor a creamy cup of coffee to partition itself into distinct layers of cream and coffee.
Yet similar forms of unexpected order arise in complex systems. The reason this can occur is that complex systems are not fully bounded: they are Open / Dissipative structures, subject to some form of energy entering from the outside, and within these "loose" boundaries we see glimpses of temporary order. Disorder is, however, still being ejected beyond those same boundaries - stuff comes in, stuff goes out - in some other form. It is only within the boundaries that we see temporary pockets of order. To get a better grasp on how these pockets of temporary order appear, we need to understand the relationship between entropy (disorder, or randomness) and information.
An important way of thinking about this increase in order comes from information theory. Information theory, as developed by Claude Shannon, evaluates systems according to the amount of information - the number of 'bits' - required to describe them.
Shannon might ask, what is the amount of information required to know where a specific molecule of cream is located in a cup of coffee? Further, in what kinds of situations would we require more or less information to specify a location?
Example:
In a mixed, creamy cup of coffee, any location is equally probable for any molecule of cream. We therefore have maximum uncertainty about location: the situation has high entropy, high uncertainty, and requires high information content to specify a location. By contrast, if the cream and coffee were to be separated (say in two equal layers with the cream at the top) we would now have a more limited range of locations where a particular bit of cream might be placed. Our degree of uncertainty about the cream's location has been reduced by half, since we now know that any bit of cream has to be located somewhere in the upper half of the cup - all locations at the bottom of the cup can be safely ignored.
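A minimal sketch of this arithmetic in Python (the cell counts below are illustrative assumptions, not physical measurements): if we model the cup as a set of equally likely locations, confining the cream to half the cup removes exactly one bit of uncertainty.

```python
import math

def bits_of_uncertainty(equally_likely_locations: int) -> float:
    """Shannon entropy, in bits, of a uniform choice among N locations."""
    return math.log2(equally_likely_locations)

mixed_cup = 1024    # hypothetical count of equally likely cells in a stirred cup
layered_cup = 512   # cream confined to the top half: half as many candidate cells

print(bits_of_uncertainty(mixed_cup))    # 10.0 bits
print(bits_of_uncertainty(layered_cup))  # 9.0 bits -- halving removes exactly one bit
```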
Counterintuitively, the more Shannonian information required to describe a system, the less structured or "orderly" it appears to us. Thus, as a system becomes more differentiated and orderly - or as emergent features arise - its level of Shannon information diminishes.
This, in a way, is unfortunate: our colloquial understanding of 'having a lot of information' pertains to knowing more about something. Thus, seeing a cup of coffee divided into cream and coffee layers, we perceive something with more structure and more logic, and we might assume that it conveys more information to us (at least in our normal ways of thinking about information - in this case, that coffee and cream are different things!). A second, stirred cup appears more homogeneous - it has less structure or organization. And yet it requires more Shannon information to describe it.
A difficulty thus lies in how we intuitively use the words 'disorder' and 'information'. We associate disorder with lack of structure (and therefore low amounts of information), and order with more knowledge (and therefore more information).
However intuitive that may feel, it is not how things work from the perspective of information and communication signals - which is what Shannon was concerned with when formulating his ideas. Shannon (who worked for Bell Laboratories) was trying to understand the number of bits needed to relay the state of a system (or a signal).
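For reference, Shannon's standard measure (stated here for completeness; the formula itself is not quoted on this page) says that for a source whose states occur with probabilities $p_i$, the number of bits needed to relay its state is $H = -\sum_i p_i \log_2 p_i$. This quantity is maximized when all states are equally likely (the stirred cup), and falls to zero when a single state is certain.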
Example:
Imagine I have an extremely messy dresser and I am looking for my favorite shirt. I open my dresser drawers and see a jumble of miscellaneous clothes: socks, shirts, shorts, underwear. I rifle through each drawer examining each item to see if it, indeed, is the shirt I am seeking. To find the shirt I want (which could be anywhere in the dresser), I require maximum information, since the dresser is in a state of maximum disorder.
Thankfully, I spend the weekend sorting through my clothes. I divide the dresser by category, with separate socks, shirts, shorts, and underwear drawers. Now, if I wish to find my shirt, my uncertainty about its location has been cut to a quarter of what it was (assuming four drawers in the dresser). To discover the shirt in the dresser's more ordered state requires less information: I can limit myself to looking in one drawer only.
Let us take the above example a little further:
Imagine that I love this particular shirt so much that I buy 100 copies of it, so many that they now fill my entire dresser. The following morning, upon waking, I don't even bother to turn on the lights. I reach into a drawer (any drawer will do), and pull out my favorite shirt!
My former, messy dresser had maximum disorder (high entropy), and required a maximum amount of Shannon information ('bits' of information to find a particular shirt). By contrast, the dresser of identical shirts has maximum order (negentropy), and requires a minimal amount of Shannon information (bits) to find the desired shirt.
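A short sketch of this arithmetic in Python (the item counts are illustrative assumptions, not from the text): entropy is highest for the messy dresser, drops once categories confine the search to one drawer, and vanishes entirely for the dresser of identical shirts.

```python
import math

def shannon_entropy_bits(probabilities):
    """Entropy in bits: -sum(p * log2(p)) over outcomes with nonzero probability."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

messy_dresser = [1 / 40] * 40   # hypothetical: 40 spots, shirt equally likely anywhere
sorted_dresser = [1 / 10] * 10  # category known: only one drawer's 10 spots remain
identical_shirts = [1.0]        # any grab succeeds: a single certain outcome

print(shannon_entropy_bits(messy_dresser))     # ~5.32 bits
print(shannon_entropy_bits(sorted_dresser))    # ~3.32 bits
print(shannon_entropy_bits(identical_shirts))  # 0.0 bits -- maximum order
```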
It should be noted that the two extreme states illustrated above are both pretty uninteresting. A fully random dresser (maximum information) is pretty meaningless, but so is a dresser filled with identical shirts (minimum information). While each is described by a contrasting state of Shannonian information, neither maximum- nor minimum-information systems appear very interesting.
One might also imagine that neither the random nor the homogeneous system is all that functional. A dresser filled with identical shirts does not do a very good job of meeting my diverse requirements for dressing (clothing for different occasions or different body parts), while my random dresser, though it meets these needs, can't function well because it takes me forever to sort through.
Similarly, systems with too much order cannot respond to a world filled with different kinds of situations. Furthermore, they are more vulnerable to system disruption. If you have a forest filled with a single tree species, one destructive insect infestation might have the capacity to wipe out the entire system. If I own 100 identical shirts and that shirt goes out of style, I suddenly have nothing to wear.
Meanwhile, if everything is distributed at random, then functional differences can't arise: a mature forest ecosystem has collections of species that work together, processing environmental inputs in ways that siphon resources effectively - certain species are needed more than others. In my dresser, I need to find the right balance between shirts, socks, and shorts: some things are worn more than others, and I will run into shortages of some, and excesses of others, if I am not careful.
What is interesting in Complexity is that, in order to be responsive to a world that consists of different kinds of inputs, complex systems appear to tune themselves to information states involving just enough variety (lots of different kinds of clothes / lots of different tree species) and just enough homogeneity (clusters of appropriately scaled groups of clothing or species). While within their boundaries these systems violate the second law of thermodynamics (gaining order), they do not gain so much order as to become homogeneous. The phrase 'poised at the edge of order and chaos' seems to capture this dynamic.
Imagine we have a system looking to optimize a particular behavior - say an ant colony seeking food. We place an assortment of various-sized bread crumbs on a kitchen table, and leave our kitchen window open overnight. Ants march in through the window, along the floor, and up the leg of the table.
Which way should they go?
From the ants' perspective, there is maximum uncertainty about the situation - or maximum Shannonian information. The ants spread out in all directions, seeking food at random. Suddenly, one ant finds food, and joyfully secretes some pheromones as it carries it away. The terrain of the table is no longer totally random: there is a signal - food here! Nearby ants pick up the pheromone signal and, rather than moving at random, slightly adjust their trajectories. Each ant's level of uncertainty about the situation has been reduced or, put another way, the pheromone trail represents a compression of informational uncertainty - going from 'maximum information required' (search every space) to 'reduced information required' (search only spaces near the pheromone trace).
If all ants had to independently search every square inch of tabletop to find food, each would require maximum information about all table states. If, instead, they can be steered by signals (see Stigmergy) deployed by other ants, they can limit their search to only some table states. By virtue of the collective, the table has become more 'organized', in that it requires less information to navigate towards food. There is a reduction of uncertainty - a reduction in the 'information bits' required by each ant to find the location of 'food bits' - and accordingly these are more easily discovered. It is worth noting that in this particular system, the 'food bits' are effectively the Driving Flows that energize the system and thereby help fuel the localized order. The second law is preserved, since the ants will ultimately dissipate this order (through heat generated in their movements, through defecation as they process food, and ultimately through death and decay).
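A minimal simulation sketch of this dynamic in Python (the table size, pheromone radius, and one-dimensional cell model are all illustrative assumptions): the scout ant probes the whole table at random, while a follower confines its probes to cells near the pheromone mark, so it needs far fewer probes on average.

```python
import random

random.seed(42)
TABLE_CELLS = 100                     # illustrative table size (an assumption)
food = random.randrange(TABLE_CELLS)  # the crumb sits in one unknown cell

def probes_to_find(candidate_cells):
    """Count random probes into candidate_cells until the food cell is hit."""
    probes = 0
    while True:
        probes += 1
        if random.choice(candidate_cells) == food:
            return probes

# Scout ant: no signal yet, so every cell is a candidate (~log2(100) bits).
everywhere = list(range(TABLE_CELLS))
print("scout ant probes:", probes_to_find(everywhere))

# Follower ant: the pheromone mark left at the find narrows the search to
# the ~11 cells around it (~log2(11) bits) -- far fewer probes on average.
near_mark = [c for c in everywhere if abs(c - food) <= 5]
print("follower ant probes:", probes_to_find(near_mark))
```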
Suppose we are playing twenty questions. I am thinking of the concept 'gold', and you are required to go through all lists of persons, places, and things in order to eventually identify 'gold' as the correct entity. Out of a million possible entities that I might be thinking of, how long would it take to find the right one in this sequential manner? Clearly, a huge length of time. The system has maximum uncertainty (a million equally likely candidates - roughly 20 bits), and each sequential random guess eliminates only a single candidate (999,999 to go after the first guess!). While I might 'strike gold' at any point, the odds are low!
From an information perspective, we can greatly reduce the time it takes to guess the correct answer if we structure our questions so as to minimize our uncertainty at every step. Thus, if I have 1,000,000 possible answers in the game of twenty questions, I am looking for questions that reduce these possibilities to the greatest extent at each step. If, with every question, I can cut the possible answers in half (a binary search) then, within 20 questions, I can generally arrive at the solution. In fact, the game, when played by a computer, can solve for any given entity within an average of six guesses! With each guess, the degree of uncertainty regarding the correct answer (or the amount of Shannonian information required) is reduced.
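A small Python sketch of the halving arithmetic (nothing here is from the game itself; it just counts the halvings):

```python
import math

# Each well-chosen yes/no question halves the remaining candidates, so about
# log2(1,000,000) questions suffice to pin down any one of a million answers.
candidates = 1_000_000
questions = 0
while candidates > 1:
    candidates = math.ceil(candidates / 2)
    questions += 1

print(questions)             # 20 questions
print(math.log2(1_000_000))  # ~19.93 bits of initial uncertainty
```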
Sorting a system so there is less to sort
From a Complex Systems standpoint, this information sorting by agents within a system allows it to channel resources more effectively - that is, to focus on work (or questions) that moves towards success, while engaging in less wasted effort.
To illustrate: imagine that I wish to move to a new city to find a job. I can choose one of ten cities but, other than their names, I know nothing about them - including their populations. I relocate at random and find myself in a city of 50 people, with no job postings. My next random choice might bring me to a bigger center but, without any information, I need to keep relocating until I land in a place where I can find work.
If, instead, the one piece of information I have is each city's population, I can make a judgement: if I start my job hunt in larger centers, there is a better chance that jobs matching my skills will be on offer. I use population size as a way to filter certain cities out of my search - perhaps with a 'rule' stating that I won't consider relocating to cities with fewer than 1 million inhabitants. This rule might cross six cities off my search list, and this 'crossing out' is equivalent to reducing the information bits required to find a job: I can decide that my efforts are better spent focusing on a job search in only four cities instead of ten. (This may also be why, in studying cities as complex systems, we often observe growth and preferential attachment, which manifest as Power Laws in population distributions.)
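The same filtering rule, sketched in Python (the population figures are hypothetical, invented for illustration):

```python
import math

# Hypothetical populations for the ten candidate cities (illustrative numbers).
populations = [50, 3_000, 40_000, 250_000, 600_000, 900_000,
               1_200_000, 2_500_000, 4_000_000, 9_000_000]

# The rule from the text: ignore cities with fewer than 1 million inhabitants.
shortlist = [p for p in populations if p >= 1_000_000]

print(f"{len(shortlist)} of {len(populations)} cities remain")            # 4 of 10
print(f"{math.log2(10) - math.log2(4):.2f} bits of uncertainty removed")  # 1.32
```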
By now it should be clear that this is equivalent to looking for a given cream molecule in only half the coffee cup, or ants looking for food on only some parts of the table, or limiting my search in twenty questions to items in the 'mineral' category.
All these processes involve a kind of information sorting that gives rise to order, which in turn makes things run more smoothly: from random cities to differentiated cities; from random words to differentiated categories of words.
What complex systems are able to do is take a context that is initially undifferentiated and sort it - through the actions of the agents in the system - such that those agents can navigate it more efficiently. This always involves a local violation of the second law of thermodynamics, since the amount of Shannonian information (the entropy, or disorder, of the system) is being reduced. That said, this can only occur if there is some inherent difference in the system - 'something to sort' - in the first place. If a context is truly homogeneous (going back to our dresser of identical shirts), then no amount of rearranging can make it easier to navigate. Note that an undifferentiated system is different from a homogeneous one: a random string of letters is undifferentiated; a string composed solely of the letter 'A' is homogeneous.
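The string contrast is easy to make concrete. A rough sketch in Python (string lengths and alphabet are arbitrary choices for illustration): the random string carries maximum per-character entropy, while the all-'A' string carries none.

```python
import math
import random
import string
from collections import Counter

def per_char_entropy_bits(s: str) -> float:
    """Per-character Shannon entropy of a string, in bits."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

random.seed(0)
undifferentiated = "".join(random.choices(string.ascii_uppercase, k=1000))
homogeneous = "A" * 1000

print(per_char_entropy_bits(undifferentiated))  # ~4.7 bits/char: plenty to sort
print(per_char_entropy_bits(homogeneous))       # 0.0 bits/char: nothing to sort
```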
Accordingly, complex systems need to operate in a context where some kind of differential (in the form of Driving Flows) is present. The system then has something to work with, in terms of sorting through the kinds of differences that might be relevant.
One thing to be very aware of in the above examples is how difficult it is to disambiguate information from orderliness. As our knowledge of probable system states becomes more orderly, Shannonian information is reduced. This is a frustrating aspect of the term 'information', and it can lead to a lot of confusion.
This Christmas Story illustrates how binary search can quickly identify an entity
Photo Credit and Caption: Underwater image of fish in Moofushi Kandu, Maldives, by Bruno de Giusti (via Wikimedia Commons)
Cite this page:
Wohl, S. (2022, 10 June). Information. Retrieved from https://kapalicarsi.wittmeyer.io/definition/information-theory
Information was updated June 10th, 2022.