
Information

What drives complexity? The answer involves a kind of sorting of the differences the system must navigate. These differences can be understood as flows of energy or information.

In order to be responsive to a world consisting of different kinds of inputs, complex systems tune themselves to states holding just enough variety to remain responsive, and just enough homogeneity to remain organized and stable. To understand how this works, we need to understand flows of information in complex systems, and what "information" actually means.


Complex Systems are ones that appear to violate the second law of thermodynamics: that is to say, order manifests out of disorder. Another way to state this is that, within the boundary of the system, order (Negentropy) increases over time. This appears counter to the second law of thermodynamics, which states that, left to its own devices, a system's disorder (entropy) will increase. Thus we expect that, over time, buildings break down, and a stream of cream poured into a cup of coffee will dissipate. We don't expect a building to rise from the dust, nor a creamy cup of coffee to partition itself into distinct layers of cream and coffee.

Yet similar forms of unexpected order arise in complex systems. The reason this can occur is that complex systems are not fully bounded - they are {{open-dissipative}} structures, subject to some form of energy entering from the outside, and within these "loose" boundaries we see glimpses of temporary order. Disorder is, however, still being ejected outside of these same boundaries - stuff comes in, stuff goes out - in some other form. It is only within the boundaries that we see temporary pockets of order. To get a better grasp on how these pockets of temporary order appear, we need to understand the relationship between entropy (disorder, or randomness) and information.

PART I: Understanding Information

Shannonian Information

An important way of thinking about this increase in order comes from information theory. Information theory, as developed by Claude Shannon, evaluates systems based upon the amount of information, or 'bits', required to describe them.
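Formally, Shannon defined the information (entropy) of a source whose possible states occur with probabilities $p_i$ as

$$H = -\sum_i p_i \log_2 p_i \quad \text{(bits)}$$

For $N$ equally likely states this reduces to $H = \log_2 N$: the average number of yes/no questions needed to pin down the state.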

Shannon might ask, what is the amount of information required to know where a specific molecule of cream is located in a cup of coffee? Further, in what kinds of situations would we require more or less information to specify a location?

Example:

In a mixed, creamy cup of coffee, any location is equally probable for any molecule of cream. We therefore have maximum uncertainty about location: the situation has high entropy and requires maximum information to specify a location. By contrast, if the cream and coffee were separated (say in two equal layers, with the cream at the top), we would have a more limited range of locations where a particular bit of cream might be found. The number of possible locations has been cut in half, since we now know that any bit of cream has to be located somewhere in the upper half of the cup - all locations at the bottom of the cup can be safely ignored - and our uncertainty is reduced accordingly.
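As a minimal sketch of this bookkeeping (the number of cells the cup is divided into is an arbitrary assumption):

```python
import math

def uniform_entropy_bits(num_locations: int) -> float:
    """Shannon entropy H = log2(N) when all N locations are equally likely."""
    return math.log2(num_locations)

CELLS = 1024  # hypothetical: the cup discretized into 1024 equal cells

mixed = uniform_entropy_bits(CELLS)           # the molecule could be anywhere
separated = uniform_entropy_bits(CELLS // 2)  # confined to the top half

print(f"mixed cup:     {mixed:.1f} bits")      # 10.0 bits
print(f"separated cup: {separated:.1f} bits")  # 9.0 bits
```

Halving the candidate locations removes exactly one bit of uncertainty, whatever the cell count.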

Information vs Knowledge

Counterintuitively, the more Shannonian information required to describe a system, the less structured or "orderly" it appears to us. Thus, as a system becomes more differentiated and orderly - or as emergent features arise - its level of Shannon information diminishes.

This, in a way, is unfortunate: our colloquial understanding of 'having a lot of information' pertains to knowing more about something. Thus, seeing a cup of coffee divided into cream and coffee layers, we perceive something with more structure, more logic, and we might assume it should follow that this conveys more information to us (at least in our normal way of thinking about information - in this case, that coffee and cream are different things!). A second, stirred cup appears more homogeneous - it has less structure or organization. And yet it requires more Shannon information to describe.

A difficulty thus lies in how we intuitively use the words 'disorder' and 'information'. We associate disorder with lack of structure (and therefore low information), and order with more knowledge (and therefore more information).

While intuitively appealing, this is unfortunately not how things work from the perspective of information and communication signals - which is what Shannon was concerned with when formulating his ideas. Shannon (who worked for Bell Laboratories) was trying to understand the number of bits needed to relay the state of a system (or a signal).

Example:

Imagine I have an extremely messy dresser and I am looking for my favorite shirt. I open my dresser drawers and see a jumble of miscellaneous clothes: socks, shirts, shorts, underwear. I rifle through each drawer, examining each item to see if it is, indeed, the shirt I am seeking. To find the shirt I want (which could be anywhere in the dresser), I require maximum information, since the dresser is in a state of maximum disorder.
Thankfully, I spend the weekend sorting through my clothes. I divide the dresser by category, with separate socks, shirts, shorts, and underwear drawers. Now, if I wish to find my shirt, my uncertainty about its location has been reduced to one quarter of what it was (assuming four drawers in the dresser). Discovering the shirt in the dresser's more ordered state requires less information: I can limit myself to looking in one drawer only.

Let us take the above example a little further:

Imagine that I love this particular shirt so much that I buy 100 copies of it, so many that they now fill my entire dresser. The following morning, upon waking, I don't even bother to turn on the lights. I reach into a drawer (any drawer will do), and pull out my favorite shirt!

My former, messy dresser had maximum disorder (high entropy), and required a maximum amount of Shannon Information ('bits' of information to find a particular shirt). By contrast, the dresser of identical shirts has maximum order (negentropy), and requires a minimal amount of Shannon Information (bits) to find the desired shirt.
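In rough numbers - a sketch, with the item and drawer counts chosen arbitrarily:

```python
import math

ITEMS = 64   # hypothetical number of garments in the dresser
DRAWERS = 4  # socks, shirts, shorts, underwear

messy = math.log2(ITEMS)                      # 6.0 bits: shirt could be anywhere
sorted_drawers = math.log2(ITEMS / DRAWERS)   # 4.0 bits: only one drawer matters
identical = math.log2(1)                      # 0.0 bits: any item is the shirt

print(messy, sorted_drawers, identical)  # 6.0 4.0 0.0
```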

Interesting information: states that matter!

It should be noted that the two extreme states illustrated above are both pretty uninteresting. A fully random dresser (maximum information) is pretty meaningless, but so is a dresser filled with identical shirts (minimum information). While each is described by a contrasting state of Shannonian information, neither maximum- nor minimum-information systems appear very interesting.

One might also imagine that neither the random nor the homogeneous system is all that functional. A dresser filled with identical shirts does not do a very good job of meeting my diverse requirements for dressing (clothing for different occasions or different body parts), but my random dresser, while meeting these needs, can't function well because it takes me forever to sort through.

Similarly, systems with too much order cannot respond to a world filled with different kinds of situations. Furthermore, they are more vulnerable to system disruption. If you have a forest filled with a single tree species, one destructive insect infestation might have the capacity to wipe out the entire system. If I own 100 identical shirts and that shirt goes out of style, I suddenly have nothing to wear.

Meanwhile, if everything is distributed at random, then functional differences can't arise: a mature forest ecosystem has collections of species that work together, processing environmental inputs in ways that siphon resources effectively - certain species are needed more than others. In my dresser, I need to find the right balance between shirts, socks, and shorts: some things are worn more than others, and I will run into shortages of some, and excesses of others, if I am not careful.

PART II:  Information Sorting in Complex Systems

Between Order and Disorder

What is interesting in Complexity is that, in order to be responsive to a world that consists of different kinds of inputs, complex systems appear to tune themselves to information states involving just enough variety (lots of different kinds of clothes; lots of different tree species) and just enough homogeneity (clusters of appropriately scaled groups of clothing or species). While within their boundaries these systems violate the second law of thermodynamics (gaining order), they do not gain so much order as to become homogeneous. The phrase 'poised at the edge of order and chaos' captures this dynamic.

Tuning a complex system - decreasing uncertainty

Imagine we have a system looking to optimize a particular behavior - say an ant colony seeking food. We place an assortment of various-sized bread crumbs on a kitchen table, and leave our kitchen window open overnight. Ants march in through the window, along the floor, and up the leg of the table.

Which way should they go?

From the ants' perspective, there is maximum uncertainty about the situation: maximum Shannonian information. The ants spread out in all directions, seeking food at random. Suddenly, one ant finds food, and joyfully secretes some pheromones as it carries the food away. The terrain of the table is no longer totally random: there is a signal - food here! Nearby ants pick up the pheromone signal and, rather than moving at random, slightly adjust their trajectories. Each ant's level of uncertainty about the situation has been reduced or, put another way, the pheromone trail represents a compression of informational uncertainty - going from 'maximum information required' (search every space) to 'reduced information required' (search only spaces near the pheromone trace).

If all ants had to independently search every square inch of tabletop to find food, each would require maximum information about all table states. If, instead, they can be steered by signals (see Stigmergy) deployed by other ants, they can limit their search to only some table states. By virtue of the collective, the table has become more 'organized', in that it requires less information to navigate towards food. There is a reduction of uncertainty - a reduction of the 'information bits' required by each ant to find the location of 'food bits' - and the food is accordingly more easily discovered. It is worth noting that in this particular system, the "food bits" are effectively the Driving Flows that energize the system and thereby help fuel the localized order. The second law is preserved, since the ants will ultimately dissipate this order (through the heat generated in their movements, through defecation as they process food, and ultimately through death and decay).
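A toy calculation (not a faithful ant simulation) suggests the scale of this compression; the grid size and trail width below are invented for illustration:

```python
import random

# Toy model: the table is a grid of cells, food sits in one cell, and a
# pheromone trace restricts followers to a narrow band of cells.
W, H = 100, 100
cells = [(x, y) for x in range(W) for y in range(H)]
food = random.choice(cells)

def expected_probes(search_space):
    """With uniform random search (no revisits), the expected number of
    cells examined before finding the food is about half the space."""
    return len(search_space) / 2

# Followers search only cells within 2 rows of the trail through the food.
near_trail = [(x, y) for (x, y) in cells if abs(y - food[1]) <= 2]

print(expected_probes(cells))       # ~5000 probes without any signal
print(expected_probes(near_trail))  # ~250 probes once the signal exists
```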

Reduce information | Reduce effort

Suppose we are playing 20 questions. I am thinking of the concept 'gold', and you are required to go through all lists of persons, places, and things in order to eventually identify 'gold' as the correct entity. Out of a million possible entities that I might be thinking of, how long would it take to find the right one by guessing sequentially? Clearly, this would involve a huge length of time. The system has maximum uncertainty (1,000,000 equally likely possibilities), and each sequential random guess eliminates only one possibility (999,999 to go after the first guess!). While I might 'strike gold' at any point, the odds are low!

From an information perspective, we can greatly reduce the time it takes to guess the correct answer if we structure our questions so as to minimize our uncertainty at every step. Thus, if I have 1,000,000 possible answers in the game of 'twenty questions', I am looking for questions that will reduce these possibilities to the greatest extent at each step. If, with every question, I can cut the remaining possibilities in half (a binary search), then within 20 questions I can generally arrive at the solution (2^20 is just over a million). In fact the game, when played by a computer, can solve for any given entity within an average of six guesses! With each guess, the degree of uncertainty regarding the correct answer (the amount of Shannonian information required) is reduced.
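A quick check of this arithmetic, sketched in code:

```python
import math

N = 1_000_000
print(math.ceil(math.log2(N)))  # 20: questions needed if each halves the field

def questions_needed(possibilities: int) -> int:
    """Count halving questions until a single candidate remains (worst case)."""
    questions = 0
    while possibilities > 1:
        possibilities = (possibilities + 1) // 2  # keep the larger half
        questions += 1
    return questions

print(questions_needed(N))  # 20
```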

Sorting a system so there is less to sort

From a Complex Systems standpoint, this information sorting by agents within a system allows the system to channel resources more effectively - that is, to focus on work (or questions) that moves towards success while engaging in less wasted effort.

To illustrate: imagine that I wish to move to a new city to find a job. I can choose one of ten cities but, other than their names, I know nothing about them, including their populations. I relocate at random and find myself in a city of 50 people, with no job postings. My next random choice might bring me to a bigger center but, without any information, I need to keep relocating until I land in a place where I can find work.

If, instead, the only piece of information I have is each city's population, I can make a judgement: if I start my job hunt in larger centers, there is a better chance that jobs matching my skills will be on offer. I use the population sizes as a way to filter certain cities out of my search - perhaps with a 'rule' stating that I won't consider relocating to cities with fewer than 1 million inhabitants. This rule might cross six cities off my search list, and this 'crossing out' is equivalent to reducing the information bits required to find a job: I can decide that my efforts are better spent focusing on a job search in only four cities instead of ten. (This may also be why, in studying cities as complex systems, we often observe the phenomena of growth and preferential attachment, which manifest as Power Laws in population distributions.)
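As a sketch, with city names and populations invented purely for illustration:

```python
# Hypothetical data: ten cities with made-up populations.
cities = {
    "Alton": 50, "Borchester": 12_000, "Carmine": 800_000,
    "Delft": 1_200_000, "Eastvale": 3_500_000, "Fornell": 900,
    "Grandmere": 2_100_000, "Havermill": 300_000, "Ironside": 5_000_000,
    "Juniper": 45_000,
}

# The 'rule' from the text: ignore cities below 1 million inhabitants.
candidates = [name for name, pop in cities.items() if pop >= 1_000_000]
print(candidates)  # 4 of 10 cities remain; 6 are 'crossed out'
```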

By now it should be clear that this is equivalent to looking for a given cream molecule in only half the coffee cup, or ants looking for food on only some parts of the table, or limiting my search in 20 questions to items in the 'mineral' category.

All these processes involve a kind of information sorting that gives rise to order, which in turn makes things go more smoothly: from random cities to differentiated cities; from random words to differentiated categories of words.

What complex systems are able to do is take a context that is initially undifferentiated and sort it, by means of the agents in the system, such that those agents can navigate it more efficiently. This always involves a local violation of the second law of thermodynamics, since the amount of Shannonian information (the entropy or disorder of the system) is being reduced. That said, this can only occur if there is some inherent difference in the system - 'something to sort' in the first place. If a context is truly homogeneous (going back to our dresser of identical shirts), then no amount of rearranging can make it easier to navigate. Note that an undifferentiated system is different from a homogeneous system: a random string of letters is undifferentiated; a string composed solely of the letter 'A' is homogeneous.
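The string comparison can be made concrete - a sketch measuring empirical per-character entropy:

```python
import math
import random
import string
from collections import Counter

def entropy_bits_per_char(s: str) -> float:
    """Empirical Shannon entropy per character: H = sum(p * log2(1/p))."""
    n = len(s)
    return sum((c / n) * math.log2(n / c) for c in Counter(s).values())

undifferentiated = "".join(random.choices(string.ascii_uppercase, k=10_000))
homogeneous = "A" * 10_000

print(entropy_bits_per_char(undifferentiated))  # ~4.7 bits: much to sort
print(entropy_bits_per_char(homogeneous))       # 0.0 bits: nothing to sort
```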

Accordingly, complex systems need to operate in a context where some kind of differential (in the form of Driving Flows) is present. The system then has something to work with, in terms of sorting through the kinds of differences that might be relevant.

One thing to be very aware of in the above examples is how difficult it is to disambiguate information from orderliness. As our knowledge of probable system states becomes more orderly, Shannonian information is reduced. This is a frustrating aspect of the term 'information', and it can lead to a lot of confusion.

This Christmas Story illustrates how binary search can quickly identify an entity



 


