Landing for the Concepts

This is the landing page for the Concepts section. The concepts covered in this section are listed below.

Tipping Points
Self-Organized Criticality
Self-Organization
Scale-Free
Rules
Power Laws
Path Dependency
Open / Dissipative
Networks
Iterations
Information
Fitness
Feedback
Far From Equilibrium
Degrees of Freedom
Cybernetics
Attractor States

Tipping Points

A tipping point (often referred to as a 'critical point') is a threshold within a system where the system shifts from manifesting one set of qualities to another.

Related Terms/Topics: Bifurcations; Fitness Landscape; Attractor States; Critical Point; Catastrophe

Most of us are familiar with the phrase 'tipping point'. We tend to associate it with moments of no return: when overfishing crosses a threshold that causes fish stocks to collapse or when social unrest reaches a breaking point resulting in riot or revolution. The concept is often associated with an extreme shift, brought about by what seems to be a slight variance in what had been incremental change. A system that seemed stable is pushed until it reaches a breaking point, at which point a small additional push results in a dramatic shift in outcomes.


While the phrase 'tipping point' tends to connote a destructive shift, the phrase 'critical point' (which also refers to a large shift in outcomes due to what appears to be a small shift in the system context) does not carry such value-laden implications. Complex systems tend to move between different regimes of behavior, and the shift from one regime to another can be quite abrupt, indicating that the system has passed through a critical point.

Example:

Water molecules respond to two critical points: zero degrees Celsius, at which they shift from fluid to solid state, and one hundred degrees Celsius, at which they shift from fluid to vapor state. We see that the kinds of behavior that water molecules obey are context dependent: they maintain fluid behaviors within, and only within, a certain temperature range. If we examine why the behavior of the water changes, we realize that fluid behavior within the zero-to-one-hundred-degree range is the behavior that involves the least possible energy expenditure on the part of the water molecules given their environmental context. Once this context shifts - becoming too cold or too hot - a particular behavioral mode is no longer the one that best conserves energy. Water molecules have the capacity to enact three different behavioral modes - frozen, fluid, or vapor - and the mode that comes to be enacted is whichever involves the least energy expenditure within a given context.

Minimizing Processes:

Another way to think about this, using a complex systems perspective, is that the global behavioral dynamics are moving from one attractor state to another (see Attractor States). When the context changes, the water molecules are forced into a different 'basin of attraction' (another term for an attractor state), and this triggers a switch in their mode.

In all complex systems, this switch from one basin of attraction to another is simply the result of the system moving out of a regime of behavior that, up until a certain point, minimized energy expenditure. Beyond that point (the tipping point), another kind of behavioral regime encounters less resistance, conserving energy expenditure given the shifted context.
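We can sketch this basin-switching numerically. In the toy model below (the quartic double-well 'energy landscape' and the slowly ramped context parameter h are illustrative assumptions, not drawn from any particular physical system), the system always follows energy-minimizing dynamics; as the context shifts incrementally, the state barely moves - until its basin of attraction vanishes, at which point a small additional shift produces an abrupt jump to the other attractor:

```python
import numpy as np

def relax(x, h, steps=5000, dt=0.01):
    """Energy-minimizing dynamics: gradient descent on the (assumed)
    double-well potential V(x) = x^4/4 - x^2/2 - h*x."""
    for _ in range(steps):
        x -= dt * (x**3 - x - h)   # step downhill along dV/dx
    return x

# Slowly ramp the context parameter h; the state starts in the left basin
states = []
x = -1.0
for h in np.linspace(0.0, 0.6, 61):
    x = relax(x, h)               # the system settles before h shifts again
    states.append((h, x))

# Locate the largest step-to-step change in the settled state
jumps = [abs(b[1] - a[1]) for a, b in zip(states, states[1:])]
i = max(range(len(jumps)), key=jumps.__getitem__)
print(f"incremental change, then a jump of {jumps[i]:.2f} near h = {states[i+1][0]:.2f}")
```

Below the critical value of h (about 0.385 for this potential), each increment changes the settled state only slightly; one increment past it, the same-sized push yields a jump an order of magnitude larger.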

A tipping point, or critical point, is one where a system moves from one regime of 'fit' behavior into another.

Of course, what we mean by 'conserving energy' is highly context-dependent. That said, even though the individual members of a political uprising are very different actors from individual water molecules in a fluid medium, the dynamics at play are in fact very similar. Up until a certain critical mass is obtained, resisting a government or a policy involves encountering a great deal of resistance. The effort might feel futile - 'a waste of energy'. But when a movement begins to gain momentum, there can be a sense that the force of the movement is stronger than the institutions it opposes. Being 'carried along' with the movement (joining an uprising) is in fact the course of action most in alignment with the forces being unleashed.

Further, once a critical mass is reached, a movement will tend to accelerate its pace due to positive feedback. This can have both positive and negative societal consequences: some mass movements, such as lynch mobs or bank runs, show us the downside of tipping points that push a system beyond a threshold and then spiral out of control.
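This critical-mass dynamic can be illustrated with a toy version of Granovetter's classic threshold model of collective behavior (the specific threshold values below are illustrative assumptions): each agent joins once the number already participating meets its personal threshold.

```python
def cascade(thresholds):
    """Iterate until stable: an agent joins once the number of agents
    already active meets or exceeds its personal threshold."""
    active = 0
    while True:
        new_active = sum(1 for t in thresholds if t <= active)
        if new_active == active:
            return active
        active = new_active

# 100 agents with thresholds 0, 1, 2, ..., 99: each joiner triggers the next
full = cascade(list(range(100)))

# Identical crowd except the lone threshold-1 agent now needs 2: no chain forms
fizzle = cascade([0, 2] + list(range(2, 100)))

print(full, fizzle)   # 100 1
```

Two nearly identical populations produce wildly different outcomes - removing a single low-threshold agent breaks the entire chain - which is exactly the sense in which a system near a tipping point needs only a minimal 'push'.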

That said, understanding that critical points may exist in a system (beyond which new kinds of behavior become feasible) can help us move outside of 'ruts' or 'taken for granted' scenarios. In the North American context, smoking was once an acceptable social practice in public space. Over time, societal norms pushed public smoking beyond a threshold of acceptability, at which point smoking went from being a normative behavior to something that, while tolerated, is ostracized in the public realm.

What other kinds of activities might we wish to encourage and discourage? If we realize that a behavioral norm is close to a critical point, then perhaps with minimal effort we can provide that additional 'push' that moves it over the edge.

Shifting Environmental Context:

Of course these examples are somewhat metaphoric in nature, but the point being made is that there can be changes in physical dynamics and changes in cultural dynamics that cause different kinds of behaviors to become more (or less) viable within the constraints of the surrounding context.

Returning to physical systems, slime mould is a unique organism that has the capacity to operate either as a collective unit or as a collection of individual cells, depending on the inputs provided by the environmental context. As long as food sources are readily available, the mould operates as single cells. When food becomes scarce, however, a critical point is reached at which cells agglomerate to form a collective body with differentiated functions. This new body has capacities for movement and food detection not available at the individual cell level, as well as new reproductive capacities.

Accordingly, we cannot think about the behavior of a complex system without considering the context within which it is embedded. The system may have different kinds of capacities depending on how the environment interacts with and 'triggers' the system. It is therefore important to be very aware of the environmental coupling of a system. What might appear to be stable behavior might in fact be behavior that relies on certain environmental features being present - change these features and entirely new kinds of behaviors might manifest.


 


Self-Organized Criticality

CAS tend to organize to a 'critical state' where, regardless of the scale of a given input, the scale of the corresponding output follows a power-law distribution.

Strike a match and drop it in the forest. How big will the resulting fire be? The forest is dry but not overly so... vegetation is relatively thick. Will the fire burn a few trees and then flame out, or will it jump from branch to branch, burning thousands of acres to the ground?


Weirdly uncorrelated cause and effect:

We might think that the scale of an event is relative to the scale of its cause, and in some instances this is indeed the case. But in the context of complex systems, we find an interesting phenomenon. These systems appear to 'tune' themselves to a point whereby system inputs of identical intensities (two matches lit on two different days, under otherwise identical conditions) result in outputs that diverge wildly (a small fire; a massive fire event). The frequency distribution of intense system outputs (relative to equivalent system inputs) follows power-law regularities.

According to Per Bak, a variety of systems naturally 'tune' themselves to operate at a threshold where such dynamics occur. He defined this 'tuning' as Self-Organized Criticality. A feature of critical states is that, once reached, system components become highly correlated or linked to other system components. That said, the links are exactly balanced: the system elements are linked just tightly enough that an input at any point can cascade through the entire system, but just loosely enough that no redundant links are needed to make this occur.

Example:

One might think about this like an array of domino-like entities that, instead of being rectangular, are vertical cylinders: able to topple in any direction. The dominos, instead of being arranged in rows, are arranged in a field, with gaps between some cylinders. Accordingly, when a cylinder falls it might strike a gap in the field, with no additional cylinders toppling. Alternately, it might strike an adjacent neighbor, in which case this neighbor will also fall in a particular direction, potentially striking another or potentially dying out. The analogy is made stronger if we imagine an arrangement whereby, regardless of the direction from which a cylinder is struck, it can fall in any direction. When a system is self-critical, it has reached a state where we can randomly choose any domino to topple and the impact on the overall field will vary according to a power-law distribution. That is to say, some disturbances will affect only a small number of surrounding dominos, while others will propagate throughout the entire system, causing all cylinders to fall. The occurrence of these large-scale versus small-scale cascades follows a power-law distribution.

Sand Piles and Avalanches

We can imagine that it would be very difficult to, from the top down, create a specific arrangement where such dynamics occur. What is surprising, and what Bak and his colleagues showed, is that natural systems will independently 'tune' themselves to such arrangements. Bak famously provides us with the 'sand pile' model as an example of self-organized criticality:

Imagine that we begin to drop a steady stream of grains of sand onto a surface. The sand begins to pile up, forming a cone shape. As more sand is added the height of the sand cone grows, and there begins to be a series of competing forces: the force of gravity that tends to drag grains of sand downwards, the friction between grains of sand that tends to hold them in place, and the input of new sand grains that tends to put pressure on both of these forces.

What Bak demonstrated is that sand will dislodge itself from the pile, cascading downwards, but that it is impossible to predict whether dropping an individual sand grain will result in a tiny cascade of sand or a massive avalanche. That said, it is possible to predict the ratio of cascade events over time - which follows a power-law distribution.

What this suggests is that the sand grains cease to respond to forces independently; instead, their responses become highly correlated with those of the other sand grains. We no longer have a collection of grains acting independently, but a system of grains in which system-wide behaviors are displayed. Accordingly, an input that affects one element in the system might die out then and there, or, because of the correlation amongst all elements, create a chain reaction.

Information Transfer

It remains unclear exactly how such system-wide correlations emerge, but we do know something about the nature of these correlations: they are tuned to the point where information is able to propagate through the system with maximum efficiency. In other words, a message or input at one node in the system (a grain of sand or a toppling cylinder) has the capacity to reach all other nodes, but with the least redundancy possible. There are gaps in the system, which means that a majority of inputs ultimately die out, but not so many gaps that it is impossible for an input to reach all elements of the system.

Coming back to our original example, when we strike a match in a forest, if the forest has achieved a 'self-critical' state, then we cannot know whether the resulting fire will spread only to a few trees, a large cluster of trees, or cascade through the entire forest. The only thing that we can know is that the largest scale events will happen  with diminishing frequency in comparison to the small scale events.

One possible way of understanding why self-organized criticality occurs is to position it as a process that emerges in systems that are affected both by a pressure to have elements couple with one another (sand-grains becoming interlocked by friction or 'sticky') and some mechanism that acts upon the system to loosen such couplings (the force of gravity pulling grains apart). The feedback between these two pressures 'tunes' the system to a critical state.

Complex systems that exhibit power laws would seem to involve such interactions between two competing and unbalanced forces. See, for example, Bifurcations.



 


Self-Organization

Self-organization refers to processes whereby coordinated patterns or behaviors manifest in a system without the need for top-down control.

A system is considered to be self-organizing when the behavior of elements in the system can, together, arrive at a globally more optimal functional regime compared to what would result if each system element behaved independently. This occurs without the benefit of any controller or director of action. Instead, the system contains elements acting in parallel (rather than hierarchically) that gradually begin to manifest correlated behaviors. The collective behaviors of the individual elements become organized into a regular form or pattern of behavior (Emergence). Further, this pattern has properties that do not exist at the level of the independent elements - that is, there is a degree of unexpectedness or novelty in what manifests at the group level as opposed to what occurs at the individual level.


An example of an emergent phenomenon generated by self-organization is flock behavior, where the flock manifests an overall identity distinct from that of any individual bird.

Characterizing 'the self' in 'Self'-organization

Let us begin by disambiguating self-organizing emergence from other kinds of processes that might also lead to global, collective outcomes.

Example - Back to School:

Imagine you are a school teacher, telling your students to form a line leading to their classroom. After a bit of chaos and jostling you will see a linear pattern form that is composed of individual students. At this point, 'the line' has a collective identity that transcends that of any given individual: it is a collective manifestation with an intrinsic identity (don't cut in the line!). The line is created by individual components and expresses new global properties, but its appearance is not the result of self-organization; it is the result of a top-down control mechanism.

Clearly 'selves' organize in this example, but not in ways that are 'self-organizing'.

Now imagine instead that you are a school teacher wanting the same group of students to play a game of tug-of-war in the school gym. Beginning with a blended room of classmates, you ask the students to pick teams. The room quickly partitions into two collectives: one composed entirely of girls and the other entirely of boys. As a teacher, you might not appreciate this self-organization, and attempt to exert top-down control in an effort to balance team gender. What is interesting about this case is that it does not require any one boy calling out 'all the boys on this side', or any one girl doing the same: the phenomenon of 'boys versus girls' self-organizes.

In the example above, we can well imagine the reasons why school teams might tend to partition into 'girls vs boys' even without explicit coordination (of course these dynamics don't always appear, but I am sure the reader can imagine lots of situations where they do).

Here, there are slight preferences (we can think of these as differentials) that generate a tendency for the elements of the system to adjust their behaviors one way versus another. In the case of the school children, the tendency of girls to cluster with girls manifests due to tacit practices: friends cluster near friends, and as clusters appear students switch sides to be nearer those most 'like' them. Even if an individual child within this group has no preference - is equally friends with girls and boys - the pressures of patterns formed by the collective will tend to tip the balance. One girl alone in a team of boys will register that her behavior is non-conforming and feel pressured to switch sides, even if this is not explicitly stated.

Here there are 'selves' with individual preferences, but global behaviors are tipped into uniformity by virtue of slight system differences that tend to coordinate action.

Conscious vs unconscious self-organization:

While the gym example should be pretty intuitive, what is interesting is that there are many physical systems that produce this same kind of pattern formation without requiring social cues or other forms of intentional volition. Instead, self-organization occurs naturally in a host of processes. Whether we are talking about schools of fish, ripples of wind-blown sand, or water molecules freezing into snowflakes, self-organization leading to emergent global features is a ubiquitous phenomenon.

While the features of self-organization manifest differently depending on the nature of the system, there are common dynamics at play regardless of the system. Agents in the system participate in a shared context wherein there exists some form of differential. The agents adjust their behaviors in accordance with slight biases in their shared context, and these adjustments, though initially minor, are then amplified through reinforcing feedback that cascades through the system. Finally, an emergent phenomenon can be recognized.

Sync!

Let us consider the sound of cicadas chirping:

cicadas chirping in sync

The cicadas chirp in a regular rhythm. There is no conductor to orchestrate the beat of the rhythm, no head cicada leading the chorus, no one in charge. The process by which the rhythm of sound (an emergent phenomena) manifests is governed purely by the mechanism of self-organization. Let us break down the system:

  1. Agents: Chirping Cicadas
  2. Shared Context: the acoustic environment shared by all cicadas
  3. Differential: the timing of the chirps
  4. Agent Bias: adjust chirp to minimize timing differences with nearby chirps
  5. Feedback: As more agents begin to chirp in more regular rhythms, this reinforces a rhythmic tendency, further syncing chirping rhythms.
  6. Emergent Phenomena: Regular chirping rhythm.

Even if all agents in the system start off with completely different  (random) behaviors, the system dynamics will lead to the coordination of chirping behaviors.
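This synchronization dynamic can be sketched with the Kuramoto model of coupled oscillators (the parameter values below are illustrative assumptions). Every 'cicada' shares the same natural chirp frequency but begins at a random phase; each one nudges its phase toward the mean rhythm it hears, and the chorus pulls itself into sync:

```python
import math, random

random.seed(0)
N = 50
phases = [random.uniform(0, 2 * math.pi) for _ in range(N)]
K, dt, omega = 2.0, 0.05, 1.0   # coupling strength, time step, chirp frequency

def coherence(phases):
    """Order parameter r: 1.0 means perfectly synced, ~0 means incoherent."""
    x = sum(math.cos(p) for p in phases) / len(phases)
    y = sum(math.sin(p) for p in phases) / len(phases)
    return math.hypot(x, y)

r_start = coherence(phases)
for _ in range(2000):
    x = sum(math.cos(p) for p in phases) / N
    y = sum(math.sin(p) for p in phases) / N
    r, psi = math.hypot(x, y), math.atan2(y, x)  # strength and mean phase of the chorus
    # each agent nudges its chirp toward the rhythm it hears (agent bias),
    # which strengthens that rhythm in turn (reinforcing feedback)
    phases = [(p + dt * (omega + K * r * math.sin(psi - p))) % (2 * math.pi)
              for p in phases]

print(f"coherence: {r_start:.2f} -> {coherence(phases):.2f}")
```

The order parameter r quantifies the emergent phenomenon: it starts near zero (incoherent chirping) and ends near one (a single collective rhythm), with no conductor anywhere in the loop.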

For another example of the power of self-organization, consider this proposition: you are tasked with getting one thousand people to walk across a bridge, with their movements coordinated so that they step in perfect rhythm. You must achieve this feat on the first try (with a group of strangers of all ages who have never met one another).

It is difficult to imagine this top-down directive ending in anything other than an uncoordinated mess. But place people on the Millennium bridge in London for its grand opening and this is precisely what we get:

as the video progresses, watch the movement of people fall into sync

There are a variety of mechanisms that permit such self-organization to occur. In the Millennium Bridge video, the bridge provides the shared context or environment for the walkers (who are the agents in the system). As this shared context sways slightly (differential), it throws each agent just a little bit off balance (feedback). Each individual then slightly adjusts their stance and weight to counteract this sway (agent bias), which serves only to reinforce the collective sway direction. Over time, as the bridge sways ever more violently, people are forced to move in a coordinated collective motion (emergence) in order to traverse the bridge.

What is important to note in this example is that we do not require the agents to agree with one another in order for self-organization to occur. In our earlier example - that of school children forming teams - we can imagine that a variety of factors are at work that have to do with active volition on the part of the children. But in the example above, the observed walking behavior has nothing to do with individual movement preferences or volition. Instead, the agents have become entangled with their context (which is partially formed of other agents) in ways that constrain their movement options.

Enslaved Behavior

Accordingly, in self-organizing systems, agents - which initially possess a high number of possible states that they are able to enact (see also Degrees of Freedom) - gradually find this range of freedom becoming increasingly limited, until only a narrow band of behavior remains possible.

Further, while the shared context of the agents might initially be the source of difference in the system (with difference gradually being amplified over time), in reality the context for each agent is a combination of two aspects: both the broader shared context (the bridge) and the emerging behaviors of all the other agents within that context.


 


Scale-Free

'Scale-free' networks are ones in which identical system structure is observed for any level of network magnification.

Complex systems tend towards scale-free, nested hierarchies. By 'scale-free', we mean that we can zoom in on the system at any level of magnification and observe the same kind of structural relations. Thus, if we look at visualizations of the world wide web, we see a few instances of highly connected nodes (youtube), many instances of weakly connected nodes (your mom's cooking blog), as well as a mid-range of intermediate nodes falling somewhere in between. The weakly connected nodes greatly outnumber the highly connected nodes, and the overall statistical distribution of node connectivity follows a power-law distribution. Thus, if we 'zoom in' on any part of the network (at different levels of magnification), we see similar, repeated patterns.
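This hub-and-long-tail structure emerges on its own under preferential attachment, the growth rule behind the Barabasi-Albert model of scale-free networks. A small sketch (the network size and the degree cut-offs used for counting are illustrative assumptions):

```python
import random

random.seed(42)

def preferential_attachment(n, m=2):
    """Grow a network node by node; each newcomer links to m existing
    nodes chosen with probability proportional to their current degree."""
    edges = [(0, 1)]
    stubs = [0, 1]   # each node appears once per incident edge, so sampling
                     # uniformly from this list is degree-proportional sampling
    for new in range(2, n):
        chosen = set()
        while len(chosen) < min(m, new):
            chosen.add(random.choice(stubs))
        for target in chosen:
            edges.append((new, target))
            stubs.extend([new, target])
    return edges

edges = preferential_attachment(5000)
degree = {}
for a, b in edges:
    degree[a] = degree.get(a, 0) + 1
    degree[b] = degree.get(b, 0) + 1

hubs = sum(1 for d in degree.values() if d >= 50)    # the 'youtubes'
leaves = sum(1 for d in degree.values() if d <= 3)   # the cooking blogs
print(hubs, leaves)
```

A handful of heavily connected hubs coexist with thousands of weakly connected nodes - the same skew described above for the world wide web.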

'Scale-free' entities are therefore fractal-like, although scale-free systems generally concern the scaling of connections or flows, rather than the scaling of pictorial imagery (which is what we associate with fractals, or objects that exhibit Self-Similarity). Accordingly, a pictorial representation of links in the world wide web does not 'look' like a fractal, but its distribution of connections observes mathematical regularities consistent with what we observe in fractals (that is to say, power laws).

A good way to think about this is that, while both scale-free systems and fractals follow power-law distributions, not all power-law distributions 'look' like perfect fractals!

At the same time, sometimes the dynamics of scale free networks align with the visuals we consider to be fractal-like. A good example here is the fractal features of a leaf:


We can think of the capillary network as the minimum structure required to reach the maximum surface area.

Nature's Optimizing Algorithm

Here, the fractal, scale-free structure of the capillary network allows the most efficient transport of nutrients to all parts of the leaf surface within the overall shortest capillary path length. This 'shortest overall path length'  is one of the reasons that we might often see scale-free features in nature: this may well be the natural outcome of nature 'solving' the problem of how to best economize flow networks.

minimum global path length to reach all nodes

The two images serve to illustrate the idea of shortest overall path length. If we wish to get resources from a central node to 16 nodes distributed along a surrounding boundary, we can either trace a direct path to each point from the center, or we can partition the path into splitting segments that gradually work their way towards the boundary. While each individual pathway from the center to an individual node is longer in the right hand image, the total aggregate of all pathways to reach all nodes from the center is shorter. Thus the image on the right (which shows scale-free characteristics), is the more efficient delivery network.
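The comparison is easy to check with a little arithmetic. In the sketch below, 16 destination nodes sit on a circle of radius 1; the hub positions at half radius are an arbitrary illustrative layout rather than an optimized one:

```python
import math

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

center = (0.0, 0.0)
# 16 destination nodes evenly spaced on a circle of radius 1
nodes = [(math.cos(2 * math.pi * k / 16), math.sin(2 * math.pi * k / 16))
         for k in range(16)]

# Option A: a direct spoke from the center to every node
direct_total = sum(dist(center, p) for p in nodes)

# Option B: branch to 4 hubs at half radius (one per quadrant),
# then from each hub out to its nearest destination nodes
hubs = [(0.5 * math.cos(a), 0.5 * math.sin(a))
        for a in (math.pi / 4, 3 * math.pi / 4, 5 * math.pi / 4, 7 * math.pi / 4)]
branched_total = sum(dist(center, h) for h in hubs)
for p in nodes:
    branched_total += dist(min(hubs, key=lambda h: dist(h, p)), p)

print(f"direct spokes: {direct_total:.2f}, branched: {branched_total:.2f}")
```

Every individual route through a hub is longer than its direct spoke, yet the aggregate branched network is noticeably shorter than the 16 direct spokes combined.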

Example - Street Networks:

We should therefore expect to see such forms of scale-free dynamics in other non-natural systems that carry and distribute flows: thus, if we think of size distribution of road networks in a city, we would expect a small number of key expressways carrying large traffic flows, followed by a moderate number of mid-scaled arteries carrying mid-scale flows, then a large number of neighborhood streets carrying moderate flows, and finally a very high number of extremely small alleys and roads that each carry very small flows to their respective destinations.

mud fractals and street networks

Fractals, scale-free networks, self-similar entities, and power-law distributions are concepts that can be difficult to disambiguate. Not all scale-free networks look like fractals, but all fractals and scale-free networks follow power laws. There are also many power-law distributions that neither 'look' like fractals nor follow scale-free network characteristics: if we take a frozen potato, smash it on the ground, and classify the size of each piece, we will find that the distribution of smashed potato sizes follows a power law (though it is not nearly as pretty as a fractal!). Finally, self-similar entities (like the romanesco broccoli shown below) are fractal-like (you can zoom in and see similar structure at different scales), but are not mathematically precise fractals.

credit: Wikimedia commons  (Jon Sullivan)


 


Rules

Complex systems are composed of agents governed by simple input/output rules that determine their behaviors.

See also: Schemata

Simple Rules - Complex Outcomes

One of the intriguing characteristics of complex systems is that highly sophisticated emergent phenomena can be generated by seemingly simple agents. How does one replicate the efficiencies of the Tokyo subway map? Simple - enlist slime mould and let it discover them! Results such as these are highly counterintuitive: when we see complicated phenomena, we expect the causal structure at work to be similarly complex. However, in complex systems this is not the case. Even if the agents in a complex system are very simple, the interactions generated amongst them can have the capacity to yield highly complex phenomena.


slime mould forming the Tokyo subway map

Take it in Context

We can conceptualize  bottom-up agents as simple entities with limited action possibilities. The decision of which action possibility to deploy is regulated by basic rules that pertain to the context in which the agents find themselves. Another way to think of 'rules' is therefore to relate them to the idea of a simple set of input/output criteria.

An agent exists within a particular context that contains a series of factors considered as relevant inputs: one input might pertain to the agent's previous state (moving left or right); one might pertain to some differential in the agent's context (more or less light); and one might relate to the state of surrounding agents (greater or fewer). An agent processes these inputs and, according to a particular rule set, generates an output: 'stay the course', 'shift left', 'back up'.

input/output rule factoring three variables

In complex adaptive systems, an aspect of this 'context' must include the output behaviors generated by surrounding agents. Further, while for natural systems the agent's context might include all kinds of factors that serve as relevant inputs, in artificial complex systems novel emergent behavior can manifest even if the only thing informing the context is surrounding agent behaviors.

Example:

Early complexity models focused precisely on the generative capacity of simple rules within a context composed purely of other agents. John Conway's 'Game of Life' is a prime example of how a very basic rule set can generate a host of complex phenomena. Starting from agents arranged on a cellular grid, with fixed rules of being either 'on' or 'off' depending on the status of the agents in neighboring cells, we see the generation of a host of rich forms. The game unfolds using only four rules that govern whether an agent is 'on' (alive) or 'off' (dead). For every iteration:
  1. 'Off' cells turn 'On' IF they have three 'alive' neighbors;
  2. 'On' cells stay 'On' IF they have two or three 'alive' neighbors;
  3. 'On' cells turn 'Off' IF they have one or fewer 'alive' neighbors;
  4. 'On' cells turn 'Off' IF they have four or more 'alive' neighbors.
The resulting behavior has an 'alive' quality: agents flash on and off over multiple iterations, seem to converge, move along the grid, swallow other forms, duplicate, and reproduce.

Conway's Game of Life
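The four rules condense to a single test - a cell is 'on' in the next iteration if it has exactly three live neighbors, or if it is currently 'on' and has exactly two - so the whole game fits in a few lines. A minimal sketch using a set of live cell coordinates:

```python
from collections import Counter

def step(alive):
    """One Game of Life iteration; `alive` is a set of (x, y) live cells."""
    # count, for every cell, how many live neighbors it has
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in alive
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0))
    # birth on exactly 3; survival on 2 or 3; every other cell is 'off'
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in alive)}

# A 'blinker': three cells in a row oscillate between horizontal and vertical
blinker = {(0, 1), (1, 1), (2, 1)}
after_one = step(blinker)
after_two = step(after_one)
print(sorted(after_one))        # [(1, 0), (1, 1), (1, 2)]
print(after_two == blinker)     # True
```

The 'blinker' is the simplest oscillator: a horizontal row of three becomes a vertical row, then flips back, purely through the local rules.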

Principle: One agent's output is another agent's input!

As we can see from the Game of Life, starting with very basic agents, who rely only on other agents' outputs as their inputs, a basic rule set can nonetheless generate extremely rich outputs.

While the Game of Life is an artificial complex system (modeled in a computer), in all real-world examples of complexity we can observe that the agents of the system are both responders to inputs from their environmental context and shapers of that same environmental context. This means that the behaviors of all agents necessarily become entangled - entering into feedback loops with one another.

Adjusting rules to targets

It is intriguing to observe that, simply by virtue of simple rule protocols that are pre-set by the programmer and played out over multiple iterations, complex emergent behavior can be produced. Here we observe the 'fact' of emergence from simple rules. But we can also imagine natural complex systems where agent rules shift over time. While this could happen arbitrarily, it makes sense from an evolutionary perspective when some agent rules are more 'fit' than others. This results in a kind of selection pressure, determining which rule protocols are preserved and maintained. Here, the discovery of simple rule sets that yield better enacted results exemplifies the 'function' of emergence.

When we couple the notion of 'rules' with context, we are therefore stating that we are not interested in just any rule set that can generate emergent outcomes, but in specific rule sets that generate emergent outcomes that are in some way 'better' with respect to a given context. Successful rule systems imply a fit between the rules the agents are employing and how well these rules assist the agents (as a collective) in achieving a particular goal within a given setting.

We can think of successful rules as ones that minimize agent effort (energy output) in resolving a given task.

As agents in a complex system enact particular rule sets, rules might be revised based on how quickly or effectively they succeed at reaching a particular target. 
When targets are achieved - 'food found!' - this information becomes a relevant system input.  Agents that receive this input may have a rule that advises them to persist in the behavior that led to the input, whereas agents that fail to achieve this input may have a rule that demands they revise their rule set!

Agents may therefore not only be conditioned by a set of pre-established inputs and outputs but also be able to revise their rules. Further, if multiple agents test different rule regimes simultaneously, then there may be other 'rules' that help agents learn from one another. If a particular rule leads agents to food, on average, in ten steps, and another rule leads agents to food, on average, in six steps, then agents adopting the second rule should have the capacity to disseminate their rule set to other agents, eventually suppressing the first, weaker rule.

Enacted 'rules' are therefore provisional tests of how well an output protocol achieves  a given goal. The test results then become another form of input:

bad test results also become an agent input,  telling the agent to: "generate a rule mutation as part of your next enacted output".
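The 'bad test results trigger a rule mutation' loop can be sketched as a toy simulation. Everything here is an illustrative assumption invented for the sketch - the one-dimensional food search, the step-size 'rule', the move budget, and the mutation scheme - not a model from the text:

```python
import random

random.seed(42)

FOOD, BUDGET = 20, 12   # food sits at x=20; agents get 12 moves per trial

def trial(step_size):
    """Return True if an agent with this rule reaches the food in budget."""
    x = 0
    for _ in range(BUDGET):
        x += step_size
        if x >= FOOD:
            return True
    return False

# Each agent's 'rule' is just its step size; start with random rules.
rules = [random.randint(1, 5) for _ in range(30)]

for generation in range(40):
    # Success -> persist in the behavior; failure -> mutate the rule.
    rules = [r if trial(r) else max(1, r + random.choice((-1, 1)))
             for r in rules]

print(sum(trial(r) for r in rules), "of", len(rules), "agents now succeed")
```

Over repeated iterations, failure-driven mutation alone is enough to steer the whole population onto workable rules.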

Novel Rule formation:

Rules might be altered in different ways. At the level of the individual -

  • an agent might choose to revise how it values or factors its inputs;
  • an agent might choose to revise the nature of its outputs.

In the first instance, the impact or value assigned to particular inputs (needed to trigger an output) might change based on how successful previous input-weighting strategies were in reaching a target goal. In order for this to occur, the agent must have the capacity to assign new 'weights' (the value or significance) to an input in different ways.

In the second instance, the agent requires enough inherent flexibility or 'Degrees of Freedom' to generate more than one kind of output. For example, if an agent can only be in one of two states, it has very little ability to realign outputs. But if an agent has the capacity to deploy itself in multiple ways, then there is more flexibility in the character of the rules it can formulate.

Rules might also be revised through processes occurring at the group level. Here, even if agents are unable to alter their performance at the individual level, there may still be mechanisms operating at the level of the group that result in better rules propagating. In this case, we would have a population of agents, each with specific rule sets that vary amongst them. Even if no individual agent has the ability to revise its particular rules, at the level of the population -

  • poor rules result in agent death - there is no internal recalibration - but agents with bad rules simply cease to exist;
  • 'good' rules can be reproduced - there is no internal recalibration - but agents with good rules persist and reproduce.

We can imagine that the two means of rule adaptation - those working at the individual level and those at the population level - might work in tandem. While all of this should not seem new (it is analogous to General Darwinism), since complex systems are not always biological ones, it can be helpful to consider how the processes of system adaptation (evolution) can instead be thought of as a process of rule revision.
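The population-level pathway - no internal recalibration, only differential persistence and reproduction - can be sketched as a toy model. The fitness target, the scoring function, and all parameters below are illustrative assumptions, not anything specified in the text:

```python
import random

random.seed(1)

TARGET = 7   # an arbitrary 'well-fit' rule value, for illustration

def fitness(rule):
    """Score a fixed rule (a number 1..10): higher when closer to the target."""
    return 1 / (1 + abs(rule - TARGET))

population = [random.randint(1, 10) for _ in range(100)]

for generation in range(30):
    # Reproduction is weighted by fitness; poorly-fit rules simply fail to
    # persist. No agent ever revises its own rule.
    population = random.choices(population,
                                weights=[fitness(r) for r in population],
                                k=len(population))

# The population converges on well-fit rules without any individual learning.
print(max(set(population), key=population.count))
</```

Selection over rule variants, rather than revision within agents, does all the work here - the 'rule revision' happens at the level of the population.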

Through agent-to-agent interaction, over multiple iterations, weaker protocols are filtered out while stronger protocols are maintained and grow. That said, the ways in which rules are revised are not entirely predictable - there are many ways in which rules might be revised. Accordingly, the trajectory of these systems is always contingent and subject to historical conditions.

see also:

Preferential Attachment

Contingency

Fixed Rules with thresholds of enactment

That said, not all complex adaptive behaviors require that rules be revised. We began with artificial systems - cellular automata - where the agent rules are fixed but we still see complex behaviors. But there are many other natural complex systems where rules are fixed but, rather than these rules being the result of a computer programmer arbitrarily determining an input/output protocol, they are the result of fundamental laws of physics or chemistry.

One particularly beautiful example of fixed rules resulting in complex behaviors in nature is the Belousov-Zhabotinsky (BZ) chemical oscillator. Here, fixed chemical interaction rules lead to complex form generation:

BZ chemical oscillator

In this particular reaction, as in other chemical oscillators, there are two interacting chemicals, or two 'agent populations', which react in ways that are auto-catalytic. The output generated by the production of one of the chemicals becomes the input needed for the generation of the other. Each chemical is associated with a particular color, which appears only when that chemical is present in sufficient concentrations. Further, the speeds of the reactions differ, leading to shifting concentrations of the coupled pair. As concentrations rise and fall, we see the emergent, oscillating color arrays.
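The rise-and-fall of a coupled autocatalytic pair can be caricatured with a simple two-variable model. This is a Lotka-Volterra-style sketch, not the actual BZ kinetics (which are far more involved); the rate equations, starting concentrations, and step size are all illustrative assumptions:

```python
# Toy autocatalytic pair: a feeds the production of b, and b consumes a,
# so the two concentrations rise and fall out of phase - the oscillation
# that shows up as shifting colour bands in the dish.

def simulate(a=1.0, b=0.5, dt=0.001, steps=60000):
    """Integrate da/dt = a - a*b, db/dt = a*b - b with simple Euler steps."""
    history = []
    for _ in range(steps):
        a, b = a + dt * (a - a * b), b + dt * (a * b - b)
        history.append((a, b))
    return history

history = simulate()
a_values = [a for a, _ in history]
print(min(a_values), max(a_values))   # a keeps cycling between low and high
```

Fixed rules, no revision anywhere - yet the concentrations never settle, cycling indefinitely between low and high values.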


 

Governing Features ↑

Power Laws

Complex system behaviors often exhibit power laws: a small number of system features wield a large amount of system impact.

Power laws are a particular mathematical distribution that appears in contexts where a very small number of system events or entities, while rare, are highly impactful, alongside a very large number of system events or entities that, while plentiful, have very little impact. Power laws arise in both natural and social systems, in contexts as diverse as earthquake behavior, city population sizes, and word-use frequency.

'Normal' vs 'Power Law' Distributions

Complex systems are often characterized by power-law distributions. A power law is a kind of mathematical distribution that we see in many different kinds of systems. It has different properties from a more familiar distribution - the 'bell curve', also known as a 'normal' or 'Gaussian' distribution.

Let's look at the two here:


Power-law (left) vs Bell-curve (right)

Most people likely remember the bell curve from high school. The fat middle (highlighted) is the 'norm' and the two sides or edges represent the extremes. Accordingly, a bell curve can illustrate things like people's heights - with 'typical' heights distributed around a large middle, and extreme heights (both very tall and very short people) represented by much smaller numbers at the extremes. There are many, many phenomena that can be graphed using a bell curve. It is suitable for depicting systems that hover around a normative 'middle', and for systems where there are no driving correlations amongst members of the set. In other words, the height of one person in a classroom is not constrained or affected by the heights of other people.

Power-law distributions are likely as common as bell-curve distributions, but for some reason people are not as familiar with them. They occur in systems where there is no normative middle around which most phenomena cluster. Furthermore, entities within a power-law set have some kind of relationship amongst them - meaning that the size of one entity in the system is in some way correlated with (or has an impact on) the size and frequency of other entities. These systems are characterized by a small percentage of phenomena or entities accounting for a great deal of influence or system impact.

This small percent is shown on the far left hand side of the diagram (highlighted), where the 'y' axis (vertical) indicates intensity or impact (of some phenomena), and the 'x' axis indicates the frequency of events, actors, or components associated with the impact. The left hand side of the diagram is sometimes called the 'fat head', and as we move along to the right hand side of the diagram, we see what is called 'the long tail'. Like the bell curve, which we can use to chart phenomena such as housing prices, heights, test scores, or household water consumption, the power law distribution can illustrate many different kinds of things. 
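The contrast between the two distributions can be sketched with Python's standard library: Gaussian draws cluster around a norm, while Pareto draws produce a 'fat head' of rare, huge values and a 'long tail' of small ones. The parameters (a mean height of 170cm, a Pareto shape of 1.2) are illustrative assumptions:

```python
import random

random.seed(0)

heights = [random.gauss(170, 10) for _ in range(100000)]   # bell curve
pareto = [random.paretovariate(1.2) for _ in range(100000)]  # power law

# In a bell curve the biggest value is only modestly above the average...
print(max(heights) / (sum(heights) / len(heights)))
# ...while in a power law the biggest value dwarfs the typical one.
print(max(pareto) / (sum(pareto) / len(pareto)))
```

The first ratio stays close to 1; the second is enormous - the signature of a distribution with no meaningful 'middle'.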

Occasionally, we can illustrate the same phenomena using bell curves and power law distributions, while simultaneously highlighting different aspects of the same phenomena.

Example:

Let's say we chart income levels on a bell curve. The majority of people earn a moderate income, and smaller numbers of people earn the very high and very low incomes at the extreme sides. Plotting this data, we get a chart that looks like the one below:

Wealth in the USA plotted as a bell curve (source: pseudoerasmus)

But we can think of income distribution another way - in terms of the impact or intensity of incomes. Consider this fact of wealth distribution: in the US, the people at the far right of the bell curve above (the wealthiest, who make up a small fraction, roughly 1%, of the population) control around 45% of the entire US wealth. Clearly, the bell curve does not capture the importance of this small fraction of extreme wealth holders.

Imagine that instead of plotting the number of people in different income brackets we were to instead plot the intensities of incomes themselves. In this case we would generate a plot showing:

  • 1% (a few people) controlling 45% (a large chunk) of total wealth;
  • 19% (a moderate number of people) controlling 35% (a moderate chunk) of total wealth;
  • 80% (the bulk of the population) controlling 20% (a small fraction) of total wealth.

These ratios plot as a power law, with approximately 20% of the people controlling 80% of the wealth resource.

80/20 Rule

These numbers, while not precisely aligning with US statistics, are not that far off, and they align with what is referred to as the '80/20' rule: where 20 percent of a system's components are responsible for 80 percent of the system's key functions or impacts. This might refer to many different kinds of things - quantities, frequencies, or intensities. Thus:

  • 20% of our wardrobe is worn 80% of the time;
  • 20% of all English words are used 80% of the time;
  • 20% of all roads attract 80% of all traffic;
  • 20% of all grocery items account for 80% of all grocery sales;

Finally, if we smash a frozen potato against a wall and sort out the resulting broken chunks:

  • 20% of the potato chunks will account for 80% of the total smashed potato.

Such ratios are so common that if you are unsure of a statistic then - provided it follows the 80/20 rule - you are likely safe to make it up! (the frozen potato being a case in point :))
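The '80/20' flavour of a power law can be checked numerically: draw Pareto-distributed 'wealths' and measure what share the richest 20% control. The shape parameter here (1.16, which classically yields roughly an 80/20 split) and the sample size are illustrative assumptions:

```python
import random

random.seed(7)

# 10,000 Pareto-distributed 'wealth' values, sorted richest-first.
wealth = sorted((random.paretovariate(1.16) for _ in range(10000)),
                reverse=True)
top_20_percent = wealth[:len(wealth) // 5]
share = sum(top_20_percent) / sum(wealth)
print(f"richest 20% hold {share:.0%} of the total")
```

The top fifth of the sample reliably holds the great bulk of the total - no coordination required, just the shape of the distribution.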

Source: themediaconsortium.org

Rank Order

Another way to help understand how power law distributions work is to consider systems in terms of what is called their 'rank order'.  We can illustrate this with language. Consider a few words from English:

  • 'The' is the most commonly used word in the English language -
    • We rank it 'first' and it accounts for 7% of all word use (rank 1).
  • "Of" is the second most commonly used word -
    • We rank it 'second' and it accounts for 3.5% of all word use (1/2 of the rank 1 word)

If we were to continue - say, looking at the 7th most frequently used word - we would expect to see it used 1/7th as frequently as the most commonly used word. And in fact -

  • 'For' is the seventh most commonly used word -
    • We rank it seventh and it accounts for 1% of all word use (1/7th of the rank 1 word).

This is perhaps the most straightforward ratio driving a power law, known as 'Zipf's Law' after George Kingsley Zipf, the man who first identified it. Zipf's law indicates that if, for example, you have 100 items in a group, the 99th item will occur 1/99th as frequently as the first item. For any element in the group, you simply need to know its rank in the order - 1st, 3rd, 25th - to understand its frequency (relative to the top-ranked item in the group).

The constant in Zipf's law is '1/n' , where the 'nth' ranked word in a list is used 1/nth as often as the most popular word.
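The 1/n arithmetic can be checked directly. The 7% figure for 'the' comes from the text above; the rest is derived from the Zipf constant, and the helper name is invented for this sketch:

```python
# Zipf's 1/n rule: the nth-ranked word appears 1/nth as often as the top word.
def zipf_frequency(rank, top_frequency=0.07):
    """Predicted share of all word use for the word at a given rank."""
    return top_frequency / rank

print(f"rank 1 ('the'): {zipf_frequency(1):.1%}")   # 7.0%
print(f"rank 2 ('of'):  {zipf_frequency(2):.1%}")   # 3.5%
print(f"rank 7 ('for'): {zipf_frequency(7):.1%}")   # 1.0%
```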

Were all power laws to follow Zipf's law, then:

  • the 20th largest city would be 1/20th the size of the largest;
  • the 10th most popular child's name would be used 1/10 of the time compared to the most popular;
  • the 3rd largest earthquake in California in 100 years would be 1/3 of the size of the largest;
  • the 50th most popular product would sell 1/50th as often as the most popular.

This is a very neat set, and it represents perhaps the most straightforward power law. That said, there can be other power-law ratios between elements which, while remaining constant, are not such 'clean' constants. These follow the same principle but are just more difficult to calculate. For example:

'1.07/n' would be a power law where the 'nth' ranked word in a list is used 1.07/n times as often as the most popular word.

Pretty in Pink

Clearly '1.07/n' is a less satisfactory ratio than '1/n'. In fact, the 1/n ratio is so pleasing that it has a few different names. 1/n is mathematically equivalent to the '1/f' ratio: instead of highlighting an element's rank in a list, 1/f highlights its frequency (the format is different but the meaning is the same).

'1/f' is also described as 'pink noise' - a statistical pattern distinct from 'brown' or 'white' noise. Each class of 'noise' pertains to a different kind of randomness in a system. In other words, while many systems exhibit random behaviors, some random behaviors differ from others. We can think of 'pink', 'white', and 'brownian' as different 'flavors' of randomness. Without getting into too much detail here (more clarification can be found on the {{pink-noise}} page), 1/f noise seems to occur frequently in natural systems, and is also associated with beauty. Without going into the mathematics of pink noise, it can be described as a frequency ratio of component distributions such that there is just enough correlation between elements to provide a sense of unity, and just enough unexpectedness to provide variety.

Dynamics generating Power-laws

Power laws distributions have been identified in many complex system behaviors, such as:

  • earthquake size and frequency
  • neuron activity
  • stock prices
  • web site popularity
  • academic citation network structure
  • city sizes
  • word use frequency
  • ....and much more!

Much time and energy has gone into identifying where these distributions occur and also trying to understand why they occur.

A strong contender for explaining the presence of power-law dynamics is that they may be the result of systems that involve both growth and Preferential Attachment. Understood colloquially as 'the rich get richer', preferential attachment is the idea that popular things tend to attract more attention, thereby becoming more popular. Similarly, wealth begets wealth. The idea of growth and preferential attachment is therefore associated with positive feedback. It can be used to explain the presence of power-law distributions in the size and number of cities (bigger cities attract more industry, thereby attracting more people...), the distribution of citations in academic publishing (highly cited authors are read more, thereby attracting more citations), and the accumulation of wealth (rich people can make more investments, thereby attracting more wealth).
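Growth plus preferential attachment can be sketched in a few lines: each new link attaches to an existing node with probability proportional to the links that node already has (the classic urn trick - a node appears in the link list once per link, so a uniform draw from the list is automatically degree-weighted). Network size and seed are illustrative assumptions:

```python
import random
from collections import Counter

random.seed(3)

links = [0, 1]   # one initial link between nodes 0 and 1
for new_node in range(2, 5000):
    links.append(random.choice(links))   # pick an endpoint ∝ current degree
    links.append(new_node)               # ...and link the newcomer to it

degrees = Counter(links)
print(degrees.most_common(5))    # a few heavily connected hubs...
print(min(degrees.values()))     # ...alongside many nodes with a single link
```

The resulting degree distribution is heavy-tailed: a handful of hubs accumulate most of the connections while the vast majority of nodes keep only one.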

Further, power laws might be understood as a phenomenon occurring in systems that involve both positive and negative feedback as interacting, co-evolving drivers operating amongst the entities within the system. Such systems involve feedback dynamics that are out of balance: some (positive) feedback dynamics amplify certain system features, while other (negative) dynamics simultaneously dampen or constrain these same features. There is a correlation between these push and pull dynamics - the greater the push forward, the more it generates a pull back, and vice versa. The imbalance in this interplay between interacting forces creates feedback loops that lead to power-law features.

An example of this would be that of reproducing species in an ecosystem with limited carrying capacity. Plentiful food would tend to amplify reproduction and survival rates (positive feedback), but as the population expands, this begins to put pressure on the food resources, leading to a push back (lower survival rates) and consequently a drop in population levels. The two driving factors in the system - growing population and dwindling food - are causally entwined with one another and are not necessarily in balance. If the system achieves a perfect balance, then it will find an equilibrium - the reproduction rate will settle to a point where it matches the carrying capacity. But if there are forces that drive the system out of balance, or if there is a lag time between how the 'push' and 'pull' (amplifying and constraining) dynamics interact, then the system cannot reach equilibrium and instead keeps oscillating between states (see {{Bifurcations}}).
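The equilibrium-versus-oscillation distinction can be sketched with the logistic map, a standard toy model of population growth against a carrying capacity (x' = r·x·(1 - x)); the particular growth rates chosen below are illustrative assumptions:

```python
def settle(r, x=0.2, warmup=1000, keep=8):
    """Iterate the map past its transient; return the values it settles into."""
    for _ in range(warmup):
        x = r * x * (1 - x)          # push (growth) and pull (capacity)
    seen = []
    for _ in range(keep):
        x = r * x * (1 - x)
        seen.append(round(x, 6))
    return sorted(set(seen))

print(settle(2.8))   # one value: push and pull balance into an equilibrium
print(settle(3.2))   # two values: the population overshoots and oscillates
```

At the lower rate the push and pull cancel out; at the higher rate the lag between growth and its consequences keeps the system bouncing between states.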


Example

It has been shown that the frequency of baby name occurrences follows a power-law distribution. In this example, what is the push/pull interplay that might lead to the emergence of this regularity? While each set of parents chooses their child's name independently, they do so within a system where their choices are somewhat driven or constrained by the choices being made by parents around them. Suppose there is a name that, for some reason, has become prevalent in popular consciousness - perhaps a character name in a popular book or TV series. It is not necessary to know the precise reasons why this particular name becomes popular, but we can imagine that certain names seem to resonate in popular consciousness or 'the zeitgeist'. Take the name 'Jennifer'. An obscure name in the 1930s, it became the most popular girl's name in the 1970s. During that time, if you were one of the approximately 16 million girls born in the US, there was a 2.8% chance you would be named Jennifer! And yet, the name had plummeted back to 1940s levels by 2012.

the rise and fall of Jennifer

But how can the rise and fall of 'Jennifer' be described using push and pull forces? We can imagine a popular name being like a contagion, where a given name catches on in popular consciousness. As it spreads, the name is brought even further into popular consciousness, potentially expanding its appeal. At the same time, the very fact that the name is popular creates a tendency for resistance - if Jennifer is on a short list of possible baby names, but a sibling or close friend names their child 'Jennifer', this has an impact on your naming choice. In fact, the more popular the name becomes, the more pullback we can expect. As more and more people tap into the popularity of a name, it becomes more and more commonplace, leading to a sense of overuse, and in turn to a search for new novelty. The interaction of push and pull causes the name to both rise and fall. In a system of names, Jennifer had an expansion rate driven by rising-popularity feedback, and then a decay rate driven by overuse and a loss of freshness.

Overuse?

Again, these dynamics find their expression in the ratios associated with power laws. The distribution is always characterized by a small number of system components wielding a high degree of system energy or impact. It should, however,  be noted that power laws are not without controversy: while power laws are often described as 'the fingerprint of complexity', some argue that the statistics upon which they are based are often skewed, and that power-laws may not be as common in systems as is frequently stated.


Because of their contributions to our understanding of power-law distributions (Zipf for word frequency and Pareto for city size), we sometimes call such power laws 'Zipf' or 'Pareto' distributions (for more see: George Kingsley Zipf and Vilfredo Pareto).


 

Governing Features ↑

Path Dependency

'Path-dependent' systems are ones where the system's history matters - the present state depends upon random factors that governed the system's unfolding.

Related Ideas and Terms: Arrow of Time; Contingency; Sensitive to Initial Conditions; History Matters


Inherent vs Contingent causality

Why is one academic cited more than another, one song more popular than another, or one city more populated than another? We tend to imagine that the reason must have to do with inherent differences between academics, songs or cities. While this may be the case, the dynamics of complex systems may lead one to doubt such seemingly common-sense assumptions.

We describe complex systems as being non-linear - meaning that small changes in the system can have cascading, large effects (think of the butterfly effect) - but what this also implies is that history, in a very real way, matters. If we were to play out the identical system with very slight changes, the specific history of each system would play a tangible role in what we perceive to be significant or insignificant.

Think about a cat video going viral. Why this video? Why this particular cat? If on a given day 100 new cat videos are uploaded, what is to say that the one going viral is inherently cuter than the other 99 out there? Perhaps this particular cat video really is more special. But a complexity perspective might counter with the idea of path dependency: that amongst many potentially viral cat videos, a particular one played this potentiality out - an accident of a specific historical trajectory, rather than a statement about the quality of the video itself.

Butterfly Effects:

The reason for this returns to the idea of the non-linearity of the system. Suppose we have six cat videos of inherently equal entertainment value. All are posted at the same time. We now roll a six-sided die to determine which of them gets an initial 'like'. This initial roll causes subsequent rolls to be slightly weighted - whatever received an initial 'like' has a fractionally larger chance of being highlighted in subsequent video feeds. Let us assume that subsequent rolls reinforce, in a non-linear manner, the first 'like'. Over time, like begets like, the rich get richer, and we see one video going viral.

If we were to play out the identical scenario in a parallel universe, with the first random toss of the die falling differently, then an entirely different trajectory would unfold. Such is the notion of 'path dependency'. Of course, it is normal to assume that, given the choice of two pathways into an unknown future, the path we take matters and will change outcomes. But in complex systems this constitutes an inherent part of the dynamics, and a 'choice' is not something that one actively elects to make so much as something that arises due to random system fluctuations.
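The cat-video thought experiment can be sketched as a reinforcement ('Pólya urn') process: six equally good videos start with one 'ticket' each, and every new like is drawn in proportion to likes already received. Replaying the identical setup with a different seed (a different 'parallel universe') sends a different video viral. The number of likes and the squared weighting are illustrative assumptions:

```python
import random

def run(seed, videos=6, likes=5000):
    rng = random.Random(seed)
    counts = [1] * videos            # identical starting conditions
    for _ in range(likes):
        # Likes beget likes: pick a video ∝ its current count, squared to
        # make the reinforcement non-linear (an illustrative assumption).
        weights = [c * c for c in counts]
        counts[rng.choices(range(videos), weights=weights)[0]] += 1
    return counts

for seed in (1, 2, 3):
    counts = run(seed)
    print(f"history {seed}: winner is video {counts.index(max(counts))}, "
          f"with {max(counts) / sum(counts):.0%} of all likes")
```

In every history one video runs away with nearly all the likes, but which video wins is decided by the earliest random draws, not by any inherent quality.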

Another way to think about this is with regard to the concept of Phase Space. Any complex system has a broad space of potential trajectories (its phase space), and the actualization of any given trajectory is subject to historical conditions. Thus, if we want to understand the dynamics of the system, we should not attune only to the path that actually unfolded - rather, we should consider the trajectories of all possible pathways. This is because the actual unfolding of any given pathway within a system is not inherently more important than all of the other pathways that did not happen to unfold.

One of the reasons that computer modeling is popular in understanding complex systems has to do with this notion of phase space and path dependency. A computer model allows us to 'explore the phase space' of a complex system: seeing if system trajectories are inherently stable and repeat themselves consistently, or if they are inherently unstable and might manifest in all kinds of different ways.

Sometimes we can imagine that a system does unfold differently in phase space, but that this unfolding tends towards particular behaviors. We then say that the system has an {{attractor}}. One of the features of complex systems is that they often have multiple attractors, and it is only by allowing the system to unfold that we are able to determine which attractor the system ultimately converges towards. It would be a mistake, however, to regard a particular attractor as being more important than another based only upon one given instance of a system unfolding.

Another feature of path dependency is that, once a particular path is enacted, it can be very difficult to move the system away from that pathway, even if better alternatives exist.

A great example of path dependency is the battle between VHS and BETA as competing video formats. According to most analysts, BETA was the superior format, but due to factors involving path dependency, VHS was able to take over the market and squeeze out its superior competitor.

Another example is that of the QWERTY key board. While initially a solution to the problem of keys jamming when pressed too quickly on a manual keyboard, the solution actually slows down the process of typing. However, even though we have long since moved to electronic and digital keyboards where jamming is not a factor, we are 'stuck' in the attractor space that is the QWERTY system. This is partially due to the historical trajectory of the system, but also all of the reinforcing feedback that works to maintain QWERTY: once people have learnt to type on one system, it is difficult to instigate change.

An urban example may also be instructive here: in Holland people bike as a normal mode of transport; in North America they drive. We can make arguments that there are inherent differences between North American and Dutch cultures that create these differences, but a complexity argument might propose, instead, that such differences are due to path dependency. Perhaps any preference the Dutch have for biking was at first only random. That said, over time, infrastructure has been created in the Netherlands that incentivizes biking (routes everywhere) and disincentivizes driving (many streets closed to traffic, lack of parking, inconvenient commutes). In North America, we have created infrastructure that incentivizes driving: big streets, huge parking areas close to where we work, and a lack of other transport alternatives. We then arrive at a situation where the Dutch bike and North Americans drive. But place a North American in Holland and they will soon find themselves happily biking, and place a Dutchman in the USA and he will soon find himself purchasing a vehicle to drive along with everyone else. Neither driving nor biking is inherently 'better' in so far as the commuter is concerned (although there may be more environmental and health benefits associated with one versus the other), but the pathways each country has taken wind up mattering, and reinforcing behaviors through feedback systems.

If we are able to better understand how to break out of ill-suited path-dependency, we may be able to solve a variety of problems that seem to be 'inherent' or 'natural' choices or preferences.

some discussion here about the butterfly effect, and lorenz ... Tie this back to Sensitive to Initial Conditions


 

Governing Features ↑

Open / Dissipative

Open and dissipative systems, while 'bounded', with primarily internal interactions, nonetheless exchange energy with their external environment.

why does a lobster not eat itself?


A system is considered to be open and dissipative when energy or inputs can be absorbed into the system, and 'waste' discharged. Here, system inputs like heat, energy, food, etc., can traverse the open boundaries of the system and ‘drive’ it towards order: seemingly in violation of the second law of thermodynamics.

In this case, order is achieved within the boundaries of the system because the disorder of the system is able to dissipate into the surrounding context. Local order (within the system) is thus maintained at the expense of global disorder (within the system and its surrounding context). Were the system fully closed off from its context, it would be unable to maintain this local order.

Example:

A basic example is found in Benard/Rayleigh convection rolls (often used when examining complex system behavior). In this example, we have a fluid in a small Petri dish, heated by a source placed under the dish. The behavior of the fluid is the system that we wish to observe, but this system is not closed: it is subject to the input of heat that traverses the boundary of the Petri dish. Further, while heat can 'get in' to the system, it can also be lost to the air above as the fluid cools. Note that the overall system clearly has a defined 'inside' (the fluid in the Petri dish) and a defined 'outside' (the surrounding environment and the heat acting upon the Petri dish), but there is not full closure between the inside and outside. This is what is meant when we say that complex systems are 'open'. We understand them as bounded (with relations primarily internal to that boundary), but they nonetheless interact in some way with their surroundings. Further, because of this openness, complex systems are able to dissipate their entropy or disorder (which always increases) and export it to the outer side of their boundary. This dissipation of entropy is what allows for an increase of order within the system boundary. Were the boundary fully closed, such an increase in order could not occur.

Let us turn to the flows driving the system. As heat is increased, the energy of this heat is transferred to the fluid, and the temperature differential between the top and the bottom of the liquid medium causes heated molecules to be driven upwards. At the same time, the force of gravity causes the cooler, heavier molecules to be driven downwards. Finally, the drag forces acting between rising and falling molecules cause their behaviors to become coordinated, resulting in 'roll' patterns associated with Benard convection.

Rayleigh/Benard Convection (fluid of oil/ silver paint)

The roll patterns that we observe form a global structure that emerges from the interactions of many agitated molecules without being 'coordinated' by them. What helps drive this coordination is the dynamics of the interacting forces that the molecules are subjected to (heat flows and gravity pressures), as well as how the independent molecular responses to these pressures tend to reinforce one another (through the drag forces exerted between molecules). That said, the molecules in the fluid solution do nothing on their own, absent the input of heat. Instead, heat is the flow that drives the system behavior. Further, as the intensity of this flow is amplified (more heat added), the behavior of the fluid shifts from regular roll patterns to more turbulent patterns.



 

Governing Features ↑

Networks

Network theory allows us to think about how the dynamics of agent interactions in a complex system can affect the performance of that system.

Network theory is a huge topic in and of itself, and can be looked at on its own, or in relation to complex systems. There are various formal, mathematical ways of studying networks, as well as looser, more fluid ways of understanding how networks can serve as a structuring mechanism.


We can think of networks in fairly simple terms: imagining, for example, a network of aircraft traveling between hubs and terminals, or a network of people working together in an office. In each example, we can imagine that, by looking at the network alone, we can deduce something about the entities that comprise it. For example, the image below could illustrate many different kinds of networks: perhaps it is an Amazon delivery network, a social network, or an academic citation network. What is interesting is that, even without knowing anything about the kind of network it is, we can still say some things about how it is structured. The network below has some pretty big hubs - around six of them that are well connected to other nodes, but not strongly connected to one another. What would the dynamics of this network be if it were a social network, or the network of a company?

What might we learn from the network?

By looking at the diagram we might learn about how information or control is exerted, about how connected entities are, and about how protracted communication channels might be. A work network in which I need to talk to my superior, who in turn talks to his boss, who in turn is one of three bosses who only talk to each other, creates very different dynamics than a network where I have connections to everyone, or where there is only one chain of command rather than three.

Network theory attempts to understand how different kinds of networks might lead to different kinds of system performances. It uses domain specific language - speaking of nodes, degree centrality, edges, etc. - and much of this detail falls outside of the scope of this website.

What is important is that complex systems are made up of individual entities and, accordingly, the ways in which these entities relate to one another matter in terms of how the whole is structured. Networks in complex adaptive systems are composed of individual agents, and the relationships between these agents tend to naturally evolve in ways that lead to power law distributions between highly and weakly connected agents. This is due to the dynamics of Preferential Attachment whereby 'the rich get richer'.
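The 'rich get richer' dynamic of preferential attachment can be sketched in a few lines of code. This is a minimal illustration (the node count and starting configuration are arbitrary choices, not drawn from the text): each new node attaches to an existing node chosen with probability proportional to that node's current degree, and a handful of heavily connected hubs emerge while most nodes stay peripheral.

```python
import random

def grow_network(n_nodes, seed=42):
    """Grow a network by preferential attachment: each new node links to
    an existing node chosen with probability proportional to that node's
    current degree - 'the rich get richer'."""
    random.seed(seed)
    degrees = {0: 1, 1: 1}       # start from two connected nodes
    edges = [(0, 1)]
    for new_node in range(2, n_nodes):
        targets = list(degrees)
        target = random.choices(targets,
                                weights=[degrees[t] for t in targets])[0]
        edges.append((new_node, target))
        degrees[new_node] = 1
        degrees[target] += 1
    return degrees, edges

degrees, edges = grow_network(1000)
# a few heavily connected hubs emerge; most nodes keep a single link
print(max(degrees.values()), min(degrees.values()))
```

Plotting the resulting degree counts on a log-log scale would show the long-tailed, power-law-like distribution discussed above.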

At its most extreme, network theory advances the idea that relationships between objects can have primacy over the objects themselves. Here, the causal chain is flipped from considering objects or entities as being the primary causal figure that structures relationships, to instead exploring how relationships might in fact be the primary driver that act to structure objects or entities.

In the social sciences, systems theory (developed by Ludwig von Bertalanffy) was the first framework to examine how networks could play a structuring role. Systems theory was considered to be a meta-framework that could be applied in disparate domains - including physics, biology, and the social sciences - and attracted a wide following. Rather than focusing upon the atomistic properties of the things that make up the system, systems theory instead attuned to the relationships that joined entities, and how these relationships were structured.

Gregory Bateson illustrates this point nicely when he considers the notion of a hand: he asks, what kind of entity are we looking at when considering a hand? The answer depends on one's perspective. We can say we are looking at five digits, and this is perhaps the most common answer (or four fingers and a thumb). If we look at the components of the hand in this manner, we remain focused on the nature of the parts - we might look at the properties of each finger and how these are structured. However, we can answer the question another way: instead of seeing five digits we can say that we see four relationships. Bateson's point was that the way in which the genome of an organism understands or structures the entity 'hand' is more closely aligned with the notion of relationships than with that of digits or objects. Accordingly, if we are to better understand natural entities we should begin to examine them from the perspective of relations rather than objects.

“You have probably been taught that you have five fingers. That is, on the whole, incorrect. It is the way language subdivides things into things. Probably the biological truth is that in the growth of this thing – in your embryology, which you scarcely remember – what was important was not five, but four relations between pairs of fingers.” - Gregory Bateson

In a similar vein, Alan Turing (father of the computer!) tried to understand the different fur patterns seen on animals (spots, patches or lines) as different outer manifestations of a common driving mechanism - where shifting the timing and intensities of the relationships within the driving mechanism results in shifts in which pattern manifests. Rather than thinking of these distinctive markings as things 'in and of themselves', Turing wanted to understand how they might simply be different manifestations of more fundamental driving relationships.

Turing based his ideas on a reaction/diffusion model showing how shifting intensities of chemical relationships could create different distinct patterns.

Network theory is important in complexity thinking because of how the structure of the network can affect the way in which emergence occurs: certain dynamics manifest in more tightly or loosely bound networks, and information, a key driver of complex processes, moves differently depending on the nature of a given network.

See also:

Small World Networks

Preferential Attachment


 

Governing Features ↑

Iterations

Complex adaptive systems unfold over time, with agents continuously adjusting behaviors in response to feedback. Each iteration moves the system towards more coordinated, complex behaviors.



One of the keys to complex adaptation occurring is the ability for the system to manifest emergent behaviors as a result of feedback.

There are a series of ways in which we can think about this feedback, each of which ties in with the concept of iterations. In all cases, we see emergent phenomena arising over the course of multiple iterations. However, each case differs in terms of how tightly coupled the emergent phenomena are with the notion of 'learning'. Thus, we can understand emergent global features as a result of the following kinds of dynamics:

- produced solely by the interaction of a rule system acting upon itself over multiple iterations - without regard to 'learning' or 'fitness' (example: Fractals, the Game of Life);

- produced by natural, physical laws enacted in a system composed of multiple agents sharing a common context, with other agents' behavior forming a feature of this context. Agents, over iterative adjustments, gradually coalesce towards regimes that minimize unnecessary energy expenditure in the system (example: metronomes on a sliding platform falling into sync, Bénard rolls forming in a heated liquid);

- produced by static rule regimes that steer agent behavior in a system composed of multiple agents sharing a common context, with other agents' behavior forming a feature of this context. Agents, over iterative adjustments, gradually coalesce towards regimes that, while not necessarily the most efficient, are the best available to the system based on the global knowledge available (example: ants forming ant trails, flocking behaviors).

Finally, we have the last concept, where emergent global features are most closely produced by learning, such that:

- by iteratively applying rule regimes and then evaluating the effectiveness of these regimes in light of feedback, agents evolve their rules so as to better align inputs with outputs - achieving a particular goal with maximum effectiveness for minimum cost (example: Darwinian evolution, firm competition).

Let us look at the first case: a rule system acting upon itself over multiple iterations. These kinds of examples are often found in artificially generated systems that exhibit complex features. Take, for example, the Koch curve:

First four iterations of the Koch Curve

We begin with a line and a simple three step rule protocol:

- for every line segment, break it into three equal segments;

- form an equilateral triangle wherever the center segment falls (removing the base);
- repeat this process for each of the newly created line segments.

As seen in the image above, a high degree of complexity is generated over multiple iterations. The same principles are at work in creating all fractal forms - basic instructions generate an output whose new properties (going from one line segment to four line segments) become the new context upon which to re-apply the rule.
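The three-step protocol can be sketched directly in code. The following is an illustrative Python sketch (representing segment endpoints as complex numbers is a convenience choice, not from the text): each iteration replaces every segment with four, so after n iterations the curve has 4^n segments and its total length has grown by a factor of (4/3)^n.

```python
import cmath
import math

def koch_step(segments):
    """Apply one iteration of the Koch rule: every segment (a pair of
    complex-number endpoints) is replaced by four shorter segments
    forming the triangular 'bump' over its middle third."""
    new_segments = []
    for a, b in segments:
        third = (b - a) / 3
        p1, p2 = a + third, a + 2 * third
        # apex of the equilateral triangle: the middle third rotated 60 degrees
        apex = p1 + third * cmath.exp(1j * math.pi / 3)
        new_segments += [(a, p1), (p1, apex), (apex, p2), (p2, b)]
    return new_segments

curve = [(0 + 0j, 1 + 0j)]   # begin with a single unit line segment
for _ in range(4):
    curve = koch_step(curve)
print(len(curve))  # 4**4 = 256 segments after four iterations
```

Note that the rule never 'learns' anything: the same transformation is applied blindly at each step, and the complexity is purely a product of re-applying it to its own output.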

While we do get a lot of richness from such phenomena, it would be incorrect to state that the resulting figure 'adapts' or 'learns' - it simply 'unfolds' over multiple iterations.

Similarly, we can create a very simple agent-based model - a grid composed of cells that follow basic rules that turn a given cell either 'on' or 'off' - and let it unfold step by step. As long as the behavior at each time step is predicated on the outcome generated at the previous time step, we are able to create incredibly interesting phenomena. Conway's Game of Life (see Rules/Schemata) is a prime example of complexity generated by simple rules that, when repeated over multiple time steps, yield highly complex behaviors.

This famous example of emergent complexity is entitled 'the Game of Life', but is it really life? While the emergent outcomes of the automata are rich in variety, can we say that the system truly adapts or learns? One feature of the output is that some patterns create iterative loops that reproduce identical patterns - meaning that once these forms emerge they can reproduce themselves. If proliferation within the grid of the game is considered to be a form of higher evolution, then this might be seen as a form of adaptation.

Game of Life from Wikimedia Commons
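A minimal sketch of the Game of Life rules makes the 'reproducing pattern' idea concrete. The 'blinker' used here is one of the simplest oscillating patterns: a row of three live cells that flips between horizontal and vertical, returning to its original form every two steps.

```python
from collections import Counter

def life_step(live):
    """Advance Conway's Game of Life one step on an unbounded grid.
    `live` is a set of (x, y) cells that are 'on'. Rules: a live cell
    survives with 2 or 3 live neighbours; a dead cell is born with 3."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0))
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# the 'blinker', a period-2 oscillator
blinker = {(0, 0), (1, 0), (2, 0)}
after_one = life_step(blinker)
after_two = life_step(after_one)
print(after_two == blinker)  # the pattern reproduces itself every two steps
```

The rules are entirely static; any appearance of 'self-reproduction' emerges from repeatedly applying them to the grid's previous state.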

Other kinds of complex systems are formed of co-mingled agents that simply behave in accordance with the energetic forces they interact with - certain chemical reactions unfold in this way. We also see systems composed of static rules, where the repeated enactment of those rules within a shared, changing context is what produces the emergent behavior.

What is truly interesting in complex adaptive systems is the way in which rules can adjust over multiple generations. In order for this to occur, agents within the system need to have a kind of goal - an outcome that is preferred over others. In the example of the Koch curve above, we cannot say that the line segments have a 'goal' of creating a particular kind of shape that is preferred over any other. The system has an emergent quality, but this emergence is in no way purposeful in and of itself.

But in CAS, the agents do have desired outcomes, or more 'fit' states. Fitness is a useful concept here, because 'goal' implies volition on the part of the agents. This is fine for some kinds of agents - an ant has the goal of finding food - but not all. Water molecules don't have goals, but they can enter into more 'fit' behavioral states in the context of their environment. Each time a rule is enacted within a given context, the results of that enactment change the context; in a complex adaptive system, it is the ongoing comparison of this new context with the agents' rules that drives adaptation.


 

Governing Features ↑

Information

What drives complexity? The answer involves a kind of sorting of the differences the system must navigate. These differences can be understood as flows of energy or information.

Complex systems are ones that violate the second law of thermodynamics: that is to say, order manifests out of disorder. Another way to state this is that such systems are ones where order (negentropy) increases over time. This is counter to the second law, which states that, left to its own devices, a system's disorder (entropy) will increase. Thus, we expect that, over time, buildings break down, and that a stream of cream poured into coffee will dissipate. We don't expect a building to rise from the dust, nor a creamy cup of coffee to partition itself into distinct layers of cream and coffee. Yet similar forms of unexpected order arise in complex systems.


PART I: Understanding Information

Shannonian Information

An important way of thinking about this increase in order relates to concepts based in information theory.  Information theory, as developed by Claude Shannon, evaluates systems based upon the amount of information or 'bits' required to describe them.

Shannon might ask, what is the amount of information required to know where a specific particle of cream is located in a cup of coffee? Further, in what kinds of situations would we require more or less information to specify a location?

Example:

In a mixed, creamy cup of coffee, any location is equally probable for any particle of cream. We therefore have maximum uncertainty about location:  the situation has high entropy, high uncertainty, and requires high information content to specify a location.  By contrast, if the cream and coffee were to be separated (say in two equal layers with the cream at the top) we would now have a more limited range of locations where a cream particle might be placed. Our degree of uncertainty about the cream's location has been reduced by half, since we now know that any particle of cream has to be located somewhere in the upper half of the cup - all locations at the bottom of the cup can be safely ignored.
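The halving of uncertainty can be made concrete with Shannon's formula for a uniform distribution, H = log2(n). The figure of 1024 regions below is an arbitrary illustration, not from the text: confining the cream to the top half of the cup removes exactly one bit of uncertainty.

```python
import math

def bits_needed(n_states):
    """Shannon entropy, in bits, of a uniform distribution over
    n_states equally likely locations: H = log2(n_states)."""
    return math.log2(n_states)

# Suppose the cup is divided into 1024 small regions a cream particle
# could occupy (1024 is an illustrative figure, not from the text).
mixed = bits_needed(1024)         # fully stirred: any region equally likely
layered = bits_needed(1024 // 2)  # cream confined to the top half
print(mixed - layered)  # halving the possible locations removes exactly 1 bit
```

Each further halving of the possible locations removes one more bit, which is why more ordered (less uncertain) states need fewer bits to describe a particle's location.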

Information vs Knowledge

Counterintuitively, the more Shannonian information required to describe a system, the less knowledge we have of it. Hence, as a system becomes more differentiated and ordered - or as emergent features arise - its level of Shannon information diminishes.

This, in a way, is unfortunate: our colloquial understanding of 'having a lot of information' pertains to knowing more about something. Thus, seeing a cup of coffee divided into cream and coffee layers, we perceive something with more structure, more logic - something we might assume conveys more information to us (that coffee and cream are different things). A second, stirred cup appears more homogeneous - it has less structure or organization. And yet, it requires more Shannon information to describe it.

A difficulty thus lies in how we tend to consider the words 'disorder' and 'information'. We associate disorder with lack of knowledge (or lack of information), and order with knowledge (and, therefore, more information).

While intuitively appealing, this is unfortunately not how things work from the perspective of information and communication signals - which is what Shannon was concerned with when formulating his ideas. Shannon was trying to understand how many bits of information are required to convey the state of a system (or a signal).

Example:

Imagine I have an extremely messy dresser and I am looking for my favorite shirt. I open my dresser drawers and see a jumble of miscellaneous clothes: socks, shirts, shorts, underwear. I rifle through each drawer examining each item to see if it, indeed, is the shirt I am seeking. To find the shirt I want (which could be anywhere in the dresser), I require maximum information, since the dresser is in a state of maximum disorder.
Thankfully I spend the weekend sorting through my clothes. I divide the dresser by category, with socks, shirts, shorts and underwear drawers. Now, if I wish to find my shirt, my uncertainty about its location has been reduced to a quarter of what it was (assuming four drawers in the dresser). To discover the shirt in the dresser's more ordered state requires less information: I can limit myself to looking in one drawer only.

Let us take the above example a little further:

Imagine that I love this particular shirt so much that I buy 100 copies of it, so many that they now fill my entire dresser. The following morning, upon waking, I don't even bother to turn on the lights. I reach into a drawer (any drawer will do), and pull out my favorite shirt!

My former, messy dresser had maximum disorder (high entropy), and required a maximum amount of Shannon Information ('bits' of information to find a particular shirt).  By contrast, the dresser of identical shirts, has maximum order (negentropy), and requires a minimal amount of Shannon Information (bits) to find the desired shirt.

Interesting information:  States that matter!

It should be noted that the two extreme states illustrated above are both pretty uninteresting. A fully random dresser (maximum information) is pretty meaningless, but so is a dresser filled with identical shirts (minimum information). While each is described by a contrasting state of Shannonian information, neither maximum nor minimum information systems appear very interesting.

One might also imagine that neither the random nor the homogeneous systems are all that functional. A dresser filled with identical shirts does not do a very good job of meeting my diverse requirements for dressing (clothing for different occasions or different body parts), but my random dresser, while meeting these needs, can't function well because it takes me forever to sort through.

Similarly, systems with too much order cannot respond to a world filled with different kinds of situations. Furthermore, they are more vulnerable to system disruption. If you have a forest filled with identical tree species, one destructive insect infestation might have the capacity to wipe out the entire system. If I own 100 identical shirts and that shirt goes out of style, I suddenly have nothing to wear.

Meanwhile, if everything is distributed at random then functional differences can't arise: a mature forest ecosystem has collections of species that work together, processing environmental inputs in ways that siphon resources effectively - certain species are needed more than others. In my dresser, I need to find the right balance between shirts, socks, and shorts: some things are worn more than others, and I will run into shortages of some, and excesses of others, if I am not careful.

PART II:  Information Sorting in Complex Systems

Between Order and Disorder

It appears that, in order to be responsive to a world that consists of different kinds of inputs, complex systems tune themselves to information states involving just enough variety (lots of different kinds of clothes / lots of different tree species) and just enough homogeneity (clusters of appropriately scaled groups of clothing or species). These systems violate the second law of thermodynamics (gaining order), but without gaining so much order as to become homogeneous.

Decrease in Shannonian information = decrease in uncertainty

Imagine we have a system looking to optimize a particular behavior - say an ant colony seeking food. We place an assortment of various-sized bread crumbs on a kitchen table, and leave our kitchen window open overnight. Ants march in through the window, along the floor, and up the leg of the table.

Which way should they go?

From the ants' perspective, there is maximum uncertainty about the situation - or maximum Shannonian information. The ants spread out in all directions, seeking food at random. Suddenly, one ant finds food, and secretes some pheromones as it carries the food away. The terrain of the table is no longer totally random: there is a signal - food here! Nearby ants pick up the pheromone signal and, rather than moving at random, they adjust their trajectories. The ants' level of uncertainty about the situation has been reduced or, put another way, the pheromone trail represents a compression of informational uncertainty - going from 'maximum information required' (search every space) to 'reduced information required' (search only spaces near the pheromone trace).

If all ants had to independently search every square inch of tabletop to find food, each would require maximum information about all table states. If, instead, they can be steered by signals deployed by other ants, they can limit their search to only some table states. By virtue of the collective, the table has become more ‘organized’ in that it requires less information to navigate towards food. There is a reduction of uncertainty, or reduction of 'information bits', associated with finding the location of 'food bits'. Accordingly, these are more easily discovered.

Sorting a system so there is less to sort:

Suppose we are playing twenty questions. I am thinking of the concept 'gold', and you are required to go through all lists of persons, places and things in order to eventually identify 'gold' as the correct entity. Out of a million possible entities that I might be thinking of, how long would it take to find the right one in a sequential manner? Clearly, this would involve a huge length of time. The system has maximum uncertainty (a million possibilities), and each sequential random guess eliminates only one possibility (999,999 to go after the first guess!). While I might 'strike gold' at any point, the odds are low!

From an information perspective, we can greatly reduce the time it takes to guess the correct answer if we structure our questions so as to minimize our uncertainty at every step. Thus, if I have 1,000,000 possible answers in the game 'twenty questions', I am looking for questions that will reduce these possibilities to the greatest extent at each step. If, with every question, I can cut the possible answers in half (binary search) then, within twenty questions, I can generally arrive at the solution. In fact, the game, when played by a computer, can solve for any given entity within an average of six guesses! With each guess, the degree of uncertainty regarding the correct answer (or Shannonian information) is reduced.
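The arithmetic behind the twenty-question bound is just a logarithm: the number of halvings needed to narrow a million candidates down to one. A small sketch (the candidate counts are illustrative):

```python
import math

def questions_needed(n_candidates):
    """Yes/no questions needed to isolate one item among n_candidates,
    assuming each question halves the remaining possibilities."""
    return math.ceil(math.log2(n_candidates))

print(questions_needed(1_000_000))  # 20: a million candidates, twenty questions
```

Each well-chosen question removes one bit of uncertainty, which is why a million-item search collapses into roughly twenty steps rather than a million.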

Reduce information | Reduce effort

Another way to think about this is that, as Shannon Information is reduced, the system can channel its resources more effectively - that is, focus on work (or questions) that moves towards success while expending less wasted effort.

This may be the reason why, in complex systems,  we often observe the phenomena of growth and preferential attachment.

To illustrate, imagine that I wish to move to a new city to find a job. I can choose one of ten cities but, other than their names, I know nothing about them, including their populations. I relocate at random and find myself in a city of 50 people, with no job postings. My next random choice might bring me to a bigger center but, without any information, I need to keep relocating until I land in a place where I can find work.

If, instead, the only piece of information I have is each city's population, I can make a judgement: if I start my job hunt in larger centers, there is a better chance that jobs matching my skills will be on offer. I use the population sizes as a way to filter certain cities out of my search - perhaps with a 'rule' stating that I won't consider relocating to cities with fewer than 1 million inhabitants. This rule might cross out six cities from my search list, and this 'crossing out' is equivalent to reducing the information bits required to find a job: I can decide that my efforts are better spent focusing on a job search in only four cities instead of ten.

By now it should have become clear that this is equivalent to my looking for a given cream particle in only half the coffee cup, or ants looking for food only on some parts of the table, or my search in 20 questions being limited only to items in the 'mineral' category.

All of these processes involve a kind of information sorting that gives rise to order, which in turn makes things go smoother: from random cities to differentiated cities; from random words to differentiated categories of words.

What complex systems are able to do is take a context that is initially undifferentiated and sort it, such that the agents in the system can navigate through it more efficiently. This always involves a violation of the second law of thermodynamics, since the amount of Shannonian information (the entropy or disorder of the system) is reduced. That said, this can only occur if there is some inherent imbalance in the system, or 'something to sort' in the first place. If a context is truly homogeneous (going back to our dresser of identical shirts), then no amount of system rearranging can make it easier to navigate. Note that an undifferentiated system is different from a homogeneous system. A random string of letters is undifferentiated; a string composed solely of the letter 'A' is homogeneous.

Accordingly, complex systems need to operate in a context where some kind of differential is present. The system then has something to work with, in terms of sorting through the kinds of differences that might be relevant.

One thing to be very aware of in the above examples is how difficult it is to disambiguate information from knowledge. As we gain knowledge about probable system states, Shannonian information is reduced. This is a frustrating aspect of the term 'information', and can lead to a lot of confusion.

This Christmas Story illustrates how binary search can quickly identify an entity

Negentropy


 

Governing Features ↑

Fitness

Complex Adaptive Systems become more 'fit' over time. Depending on the system, Fitness can take many forms,  but all involve states that achieve more while expending less energy.

What do we mean when we speak of Fitness? For ants, fitness might be discovering a source of food that is abundant and easy to reach. For a city, fitness might be moving the maximum number of people in the minimum amount of time. But fitness criteria can also vary - what might be fit for one agent isn't necessarily fit for another. For example, what makes a hotel room 'fit'? Is it location, or price, or cleanliness, or amenities, or all of the above?  For different people, these various factors or parameters have different 'weights'. For a backpacker traveling through Europe, maybe the price is the only thing worth worrying about, whereas for a wealthy business person it may not factor in at all.


Accordingly, the idea of fitness in any complex system is not necessarily a fixed point. There can be many different kinds of fitness, and we need to examine the system to determine what factors are at play.

That said, certain principles remain somewhat more constant, and these pertain to the idea of minimizing processes. We can imagine that certain behaviors in a system require more or less energy to perform. If an ant wants to find food, it prefers a source that takes less time to get to over one that is further away. Further, a bigger source of food is better than a smaller source, as more ants in the colony can benefit. Complex systems therefore generally gravitate towards regimes that in some way minimize the energy expended to achieve a particular goal. However, this depends on the nature of the goal.

Example: Returning to the example of finding a hotel room, consider the popular website Airbnb as a complex adaptive system. Here, two sets of bottom-up agents (room providers and room seekers) coordinate their actions, allowing useful room occupancy patterns to emerge. Some of these patterns might be unexpected. For example, a particular district in Paris might emerge as a very popular neighborhood for travelers to stay in, even though it is not in the center of the city. Perhaps it is just at a 'sweet spot' in terms of price, amenities, and access to transport to the center. This is an example of an emergent phenomenon that might not be predictable but nonetheless emerges over the course of time. In that case, rooms in that district might be more 'fit' than in another, because of these interacting parameters that are highly appealing to a broad swath of room seekers.

So in what way is the above example 'energy minimizing'? We can think of the room seekers as having different packages of energy they are willing to expend over the course of their travel. One package might hold their money, one might hold their time, and one might hold their patience to deal with irritations (noisy neighbors that keep them from sleeping, or willingness to tolerate a dirty bathroom...). Each agent in the system is trying to manage these packets of energy in the most effective way possible to preserve them for other needs. So if a room is close to the center of the city, it might preserve time energy, but this needs to be balanced with preserving money energy.

We can begin to see that fitness is not going to come in a 'one size fits all' form. Some agents will have more energy available to spend on time, and others will have more to spend on money. Further, an agent in the system might be willing to spend much more money if it results in much more time being saved, or vice versa. We can imagine an agent reaching a decision point where these two equally viable trajectories are placed in front of them. The choice of time or money might be likened to the flip of a coin, but the resulting 'fit' regimes may appear very different.

In order to better understand these dynamics, two features of CAS - the Fitness Landscape and ideas surrounding Bifurcations - clarify how CAS can unfold along multiple fit trajectories while the underlying principle of energy minimization holds true.

In the above example the agents (room seekers), employ cognitive decision-making processes to determine what a 'fit' regime is. But physical systems also gravitate to these energy minimizing regimes.

Example: When molecules in a soap solution are blown through a soap wand, nobody tells them to form a bubble, and the molecules themselves don't consider this outcome. Instead, the bubble is the soap mixture's solution to the problem of enclosing a volume of air with the minimum surface area, and therefore minimum friction. The soap bubble can therefore be considered an energy minimizing emergent phenomenon (if you want a detailed explanation, then follow the link to an article on the subject: note the phrase, 'a bubble's surface will minimize until the force of the air pressures within is equal to the 'pull' of the soap film'). We can also think of a sphere as being the natural Attractor State of a soap solution seeking to absorb maximum air with minimum surface - or doing the most with the least.
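The 'most with the least' claim can be checked numerically: among all shapes enclosing a given volume, the sphere has the least surface area. A quick comparison against a cube of equal volume (the cube is an illustrative stand-in, not from the text):

```python
import math

def sphere_area(volume):
    """Surface area of a sphere enclosing the given volume."""
    radius = (3 * volume / (4 * math.pi)) ** (1 / 3)
    return 4 * math.pi * radius ** 2

def cube_area(volume):
    """Surface area of a cube enclosing the same volume."""
    side = volume ** (1 / 3)
    return 6 * side ** 2

# for any fixed volume, the sphere needs noticeably less surface
print(round(sphere_area(1.0), 2), round(cube_area(1.0), 2))
```

The bubble 'finds' this minimum without any computation by the molecules: surface tension simply pulls the film inward until it balances the air pressure inside.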

 

Governing Features ↑

Feedback

Feedback loops occur in systems where an environmental input guides system behavior, but the system behavior (the output) in turn alters the environmental context.

This coupling between input affecting output affecting input creates unique dynamics and interdependencies between the two.


There are two kinds of feedback that are important in our study of complex systems: positive feedback and negative feedback. Despite the value-laden character of these names, there is no value judgement implied regarding 'positive' (good) versus 'negative' (bad) feedback. Instead, the terms can more accurately be described as referring to reinforcing (positive) versus dampening (negative) feedback. Reinforcing feedback can amplify slight tendencies in a system's behavior, whereas dampening feedback works to restrain any changes to system behavior.

We can think of a thermostat as a classic example of dampening (negative) feedback mechanisms at work.
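A thermostat loop can be sketched in a few lines. The temperature band and the heating and cooling rates below are arbitrary illustrative values: the point is that the output (room temperature) feeds back into the controller, which switches the heater and so dampens any drift away from the target.

```python
def thermostat_step(temp, target, heater_on, band=0.5, heat=0.2, cool=0.1):
    """One tick of a thermostat loop. The system output (room temperature)
    is fed back to the controller, which switches the heater on or off,
    dampening deviations from the target. All rates are illustrative."""
    if temp < target - band:
        heater_on = True          # too cold: switch the heater on
    elif temp > target + band:
        heater_on = False         # too warm: switch the heater off
    temp += heat if heater_on else -cool
    return temp, heater_on

temp, heater = 15.0, False        # a cold room, heater off
for _ in range(200):
    temp, heater = thermostat_step(temp, 20.0, heater)
print(round(temp, 1))  # the temperature settles near the 20.0 target
```

Without the feedback (if the heater's state never responded to the measured temperature), the room would simply heat or cool without limit; the loop is what keeps the system hovering around its set point.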


 

Governing Features ↑

Far From Equilibrium

Left to their own devices, systems tend towards orders that involve a minimum of energy expenditure; complex systems, which are able to channel continuous energy flows, operate far from equilibrium.



In order to appreciate what we mean by 'far from equilibrium' we first need to start by understanding what is meant by 'equilibrium'. We can understand equilibrium using two examples: that of a pendulum, and that of a glass of ice cubes and water.

If we set a pendulum in motion, it will oscillate back and forth, slowing down gradually, and coming 'to rest' in a position where it hangs vertically downwards. We would not expect the pendulum to rest sideways, nor to stand vertically from its fulcrum point.

We understand that the pendulum has expended its energy and now finds itself in the position where there is no energy - or competing forces - left to be expended. The force exerted upon it is that of gravity, which causes the weight to hang low. It has arrived at the point where all acting forces have cancelled out: equilibrium.
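The pendulum's slide towards equilibrium can be simulated with a simple damped-oscillator integration. The damping coefficient, time step, and other parameter values below are illustrative assumptions:

```python
import math

def simulate_pendulum(theta0, steps, dt=0.01, damping=0.5, g=9.8, length=1.0):
    """Integrate a damped pendulum (semi-implicit Euler). Friction drains
    the system's energy until the bob settles at theta = 0, hanging
    straight down: the equilibrium state. Parameter values are illustrative."""
    theta, omega = theta0, 0.0    # initial angle (radians) and angular velocity
    for _ in range(steps):
        alpha = -(g / length) * math.sin(theta) - damping * omega
        omega += alpha * dt
        theta += omega * dt
    return theta

# released at 1 radian, the pendulum oscillates, decays, and comes to rest
print(abs(simulate_pendulum(1.0, 20000)) < 1e-3)
```

Run long enough, the angle always converges to zero: with no energy flowing in, the only place for the system to go is the state where all forces cancel.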

Similarly, if we place ice cubes in a glass of water, we initially have a system (ice and water) where the water molecules within the system have very different states. Over time, the water will cool slightly, while the ice will warm slightly (beginning to melt), and gradually we will arrive at a point in time where all the differences in the system will have cancelled out. Ignoring the temperature of the external environment, we can consider that all water molecules in the glass will come to be of the same temperature.

Again, we have a system where competing differences in the system are gradually smoothed out, until such time as the system arrives at a state where no change can occur: equilibrium.

In a complex system, we see very different dynamics: part of the strangeness of emergence arises from the idea that we might see ice spontaneously manifesting out of a glass of water! This is what we mean by 'far from equilibrium': systems that are constantly being driven away from the most neutral state (the state that would follow from the second law of thermodynamics), towards states that are more complex or improbable. In order to understand how this can occur, we need to look at the flows that drive the system, and how these offer an ongoing input that pushes the system away from equilibrium.

Example:

Let's take a look at one of our favorite examples: an ant colony seeking food. Let's start 100 ants off on a kitchen table (we left them there earlier when we were looking at {{driving-flows}}). The ants begin to wander around the table, moving at random, looking for food. If there are crumbs on the table, then some ants will find them, and direct the colony towards food sources through the intermediary signal of pheromones. As we see trails form (a clear line forming out of randomness, like an ice cube spontaneously forming out of a glass of water!), we observe the system moving far from equilibrium. But imagine instead that there is no food. The ants just keep moving at random. No emergence, nothing of statistical interest happening. When we remove the driving external flow (food) that lies outside of the ant system itself, the ants become like our molecules of water in a glass: moving around in neutral, random configurations. Eventually, without food, the ants will die - arriving at an even more extreme form of equilibrium (and then decay)!
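A crude way to see this numerically is sketched below. This is a deliberately stylized model, not a real pheromone simulation: the 1-D 'table', the 80% bias toward food, and all other numbers are invented for illustration.

```python
import random

# Deliberately stylized sketch, not a real pheromone model: 100 ants on a
# 1-D 'table' of 100 positions. With a driving flow (food that biases
# movement) order appears; without it, the ants remain in a random spread.
# The 80% bias and all other numbers are invented.

def simulate_ants(ants=100, size=100, food=None, steps=500, seed=1):
    rng = random.Random(seed)
    positions = [rng.randrange(size) for _ in range(ants)]
    for _ in range(steps):
        for i, pos in enumerate(positions):
            if food is not None and rng.random() < 0.8:
                step = (food > pos) - (food < pos)   # drift toward the food
            else:
                step = rng.choice([-1, 1])           # aimless wandering
            positions[i] = min(size - 1, max(0, pos + step))
    return positions

with_food = simulate_ants(food=50)
no_food = simulate_ants(food=None)
# Ants pile up near position 50 only when the driving flow is present.
```

With the external flow present, the ant positions concentrate into an improbable, low-entropy cluster; remove it, and the distribution relaxes back toward a random, equilibrium-like spread.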


 

Governing Features ↑

Degrees of Freedom

The notion of 'degrees of freedom' pertains to the potential range of behaviors available within a given system. Without some freedom for a system to change its state, no complex adaptation can occur.



The notion of degrees of freedom comes to us from physics and statistics, where it describes the number of possible states a system can occupy. For example, a swinging pendulum is constrained to a limited number of 'states' (positions in space) that the pendulum can occupy. If we were to map all of the potential states of the pendulum's behavior we would have what is called a 'phase portrait' of the pendulum (see also Phase Space).

Understanding the degrees of freedom available within a complex system is important because it helps us understand the overall scope of potential ways in which a system can unfold. We can imagine that a given complex system is subject to a variety of inputs (many of which we might not know), but then we must ask, what is the system's range of possible outputs? The degrees of freedom tells us something about what a system is capable of doing: its potential. The system cannot act outside of the boundaries of this action potential.

For example, if we wanted to offer the maximum capacity for motion for a three-dimensional object in space, this can be provided using six degrees of freedom, which allow changes in orientation (rotation) to occur through the 'roll', 'yaw' and 'pitch' dimensions, and displacement in space to occur through the 'up/down', 'back/forward' and 'left/right' parameters. All potentials of movement are covered within this framework.

If we were to have fewer degrees of freedom, certain types of movement would no longer exist as possibilities. Thus Phase Space - which alludes to the sum total of all potential behaviors - is sometimes referred to as 'possibility space'.


So far we have been speaking about physical degrees of freedom, but we might also imagine degrees of freedom in relation to behavioral possibilities.

Imagine we want to book an Airbnb. We could think of each Airbnb as an agent in a system, competing to win us over with its 'fitness' for our stay. Each Airbnb can tune a number of parameters that one might consider important in choosing a particular place to stay: cost, cleanliness, distance to the center, size, and quietness. Different people might value (or weight) these parameters differently, and choose their Airbnb accordingly. At the same time, each Airbnb has the capacity to adjust its 'offering' to different degrees in different dimensions. Location is clearly a limited parameter: a given unit has no capacity to relocate closer to the city center. Size is also difficult to alter. But cleanliness, or price point, might have more flexibility. If Airbnbs can be considered as agents in a complex system, competing to find patrons who wish to stay at their property, then they have to operate within the boundary of certain degrees of freedom in terms of how they align themselves with different user needs. Thus if they can't compete on the basis of location, they can attempt to compete on the basis of cost.

The Airbnb case should also serve to illustrate that, in many scenarios, the degrees of freedom available to an agent in a complex system cannot be plotted in three-dimensional space: the agent may be located within a space of many parameters.
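To make the multi-parameter picture concrete, here is a hypothetical sketch in Python. All parameter names, values, and guest weights below are invented; the point is only that 'fitness' is a position in a weighted multi-parameter space, and that only some coordinates are free to move.

```python
# Hypothetical sketch: each listing is a point in a multi-parameter space,
# and a guest scores listings with their own weights. All names, values,
# and weights below are invented for illustration.

def score(listing, weights):
    """Weighted 'fitness' of a listing for one guest; higher is better."""
    return sum(weights[k] * listing[k] for k in weights)

# 'centrality' is a frozen degree of freedom (a unit cannot move), but
# cost is adjustable: listing A competes by keeping its price low.
listing_a = {"cleanliness": 9, "quietness": 7, "centrality": 3, "cost": -80}
listing_b = {"cleanliness": 6, "quietness": 5, "centrality": 9, "cost": -120}
guest = {"cleanliness": 1.0, "quietness": 0.5, "centrality": 0.5, "cost": 0.05}

a, b = score(listing_a, guest), score(listing_b, guest)   # a beats b for this guest
```

A different guest, with different weights, would rank the same two listings differently: fitness is relative to the environment (here, the guest) in which an agent competes.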

Degrees of freedom is thus another way of thinking about the responsive capacity of a system in light of a range of environmental changes or fluctuations. Another way to describe this is the idea of a system's {{requisite-variety}}: a phrase coined by Ross Ashby to describe the amount of variability a system can enact. According to Ashby, a system needs to have a variety of responses commensurate with the variety of inputs it faces.


 

Governing Features ↑

Cybernetics

Cybernetics is the study of systems that self-regulate: 'steering' the system so as to align operations with a pre-determined outcome.



Cybernetics is an important early precursor to Complex Systems thinking.

The word Cybernetics comes from the Greek 'Kybernetes', meaning 'steersman' or 'oarsman' - the same root gives us the English word 'Governor'. In this case, however, cybernetics is interested in dynamics that lead to internal rather than external governing.

A good example comes from that Greek root of 'steersman'. If we imagine a ship sailing towards a target (say, an island), there are various forces (wind and waves) that act upon the ship to push it away from its trajectory. In order to maintain a trajectory towards the island, the steersman need not be aware of the speed or direction of the wind, or the velocity of the waves. Instead, they just need to keep their eye on the target, and keep adjusting the rudder of the ship to correct for any deviations from the route.

In a sense, we have here a complete system that works to correct for any disturbances. The system is comprised of the target, any and all forces pushing the ship away from the target, the steersman registering the amount of deviation, and subsequently counterbalancing this through means of interaction with the rudder.

While it is true that the steersman is the agent that 'activates' the rudder, it is also true that the deviation from the target 'activates' the steersman, and the forces acting upon the ship are what activate the deviation. We thus have a complete cybernetic system, where the forces at work form a continuous loop, and where the loop, in turn, is able to self-regulate.
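The steersman loop can be sketched as a simple proportional controller. This is a toy model: the gain, disturbance magnitudes, and step count are invented, and the 'wind' is just uniform noise.

```python
import random

# Sketch of the steersman's loop: the controller never measures the wind,
# only the deviation from the target course, yet the loop holds course.
# Gain and disturbance magnitudes are illustrative.

def steer(steps=200, gain=0.5, seed=7):
    rng = random.Random(seed)
    heading = 0.0                            # deviation from target course (degrees)
    for _ in range(steps):
        heading += rng.uniform(-2.0, 2.0)    # wind and waves push the ship off course
        heading -= gain * heading            # rudder correction proportional to deviation
    return heading

# With the correction in place the ship stays within a degree or two of
# the target; with gain=0 (no steersman) it drifts in a random walk.
```

Note what the controller does not know: nothing about the wind or waves themselves, only the current deviation. That is the essence of the cybernetic loop.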

A cybernetic system works to dampen  any disturbances or amplifying feedback that would move the trajectory away from a given optimum range. Thermostats work on cybernetic principles, where temperature fluctuations are dampened.

Like CAS, Cybernetics is concerned with how a system interacts with its environment. However, Cybernetics focus on systems subject to negative feedback: ones self-regulating to maintain regimes of stable equilibrium where disruptions (or Perturbations) are dampened.

Macy Conferences

Control

Watch Stafford Beer, one of the early proponents of Cybernetics, discussing the Watt flyball governor.

As Cybernetics theory is concerned with how systems might self-regulate towards an optimum, its insights were considered relevant for any system seeking to optimize performance in the absence of an external regulator. Urban environments are one such system, comprised of parts that together form an environment which subsequently (in a recursive loop) regulates and alters the parts within. As such, throughout the 1960s and 1970s a natural offshoot of cybernetic thought was conceptualizing ways in which healthy urban environments might be stabilized through cybernetic principles.



 

Governing Features ↑

Attractor States

Complex Systems can unfold in multiple trajectories. However, there may be trajectories that become more stable or 'fit'. Such states are considered 'attractor states' towards which a system tends to gravitate.

Attractor states, or 'basins of attraction', can be visualized as part of a fitness landscape.

Complex Adaptive Systems do not obey predictable, linear trajectories. They are Sensitive to Initial Conditions, and small changes in these conditions can lead the system to unfold in entirely unexpected ways. That said, some of these 'potential unfoldings' are more likely to occur than others. We can think of these as 'attractor states' to which a system - out of all possible states - will tend to gravitate. However, these attractor states may also shift over time, and are subject to system disruptions, or what is referred to as a Perturbation. Attractor states can also emerge gradually as the system evolves, but once present can reinforce themselves by constraining the actions of the agents forming the system. Thus we can think of Silicon Valley as an emergent attractor for tech firms that has, over time, reinforced its position. When a system finds itself 'trapped' in a basin of attraction (such that it cannot explore other potential configurations that may be more fit), it is considered to be in an Enslaved State.


Further, complex systems can sometimes oscillate between more than one attractor state. Thus, in a complex predator/prey ecosystem, the population numbers of each species might rise and crash in a recurring pattern over multiple generations. This is because any system that has both a driver and an inhibitor, with a time lag between the two, will tend to enter into such oscillating regimes. Here is an example of a chemical regime, known as the Briggs-Rauscher chemical oscillator.
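The predator/prey oscillation can be sketched with the textbook Lotka-Volterra equations, integrated here with a simple forward Euler step. The coefficients are illustrative, not fitted to any real ecosystem.

```python
# The textbook Lotka-Volterra predator-prey equations, integrated with a
# simple (forward Euler) step. The coefficients are illustrative, not
# fitted to any real ecosystem.

def lotka_volterra(prey=10.0, pred=5.0, steps=5000, dt=0.01):
    a, b, c, d = 1.0, 0.1, 1.5, 0.075   # prey growth, predation, predator death, conversion
    history = []
    for _ in range(steps):
        dprey = a * prey - b * prey * pred    # driver: prey growth, checked by predation
        dpred = d * prey * pred - c * pred    # inhibitor: predators lag behind prey
        prey += dprey * dt
        pred += dpred * dt
        history.append((prey, pred))
    return history

history = lotka_volterra()
# Both populations rise and crash in a recurring cycle, tracing an orbit
# around the system's attractor rather than settling to a fixed point.
```

The lag is visible in the equations themselves: predator numbers respond to prey numbers, which respond in turn to predator numbers, so neither population can settle.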

A number of terms and ideas are used to refer to attractor points in complex systems: areas to which the system gravitates.

There may be subtle differences among these terms, but most differences in terminology arise from the field of inquiry articulating the concept: depending on whether the 'source' field is mathematics, physics, or philosophy, different words are used for the same phenomenon - in this case, the existence of 'attractor' states in complex systems that signify fitness.


 

Governing Features ↑

 

Hello There


26:26 - Non-Linearity
Related
Concepts - 218 93 212 
Fields - 11 14 19 15 12 20 

23:23 - Nested Scales
Related
Concepts - 64 217 66 
Fields - 11 14 16 

24:24 - Emergence
Related
Concepts - 214 59 72 
Fields - 11 28 15 16 13 12 20 

25:25 - Driving Flows
Related
Concepts - 84 75 73 
Fields - 11 28 17 19 10 15 18 20 

22:22 - Bottom-up Agents
Related
Concepts - 213 78 
Fields - 11 14 10 16 13 12 18 

21:21 - Adaptive Capacity
Related
Concepts - 56 88 53 
Fields - 17 10 15 16 13 12 

 

Non-Linearity

Non-linear systems are ones where the scale or size of effects is not correlated with the scale of causes, making  them very difficult to predict.

Non-linear systems are ones in which a small change to initial conditions can result in a large-scale change to the system's behavior over the course of time. This is due to the fact that such systems are subject to cascading feedback loops that amplify slight changes. The notion has been popularized in the concept of 'the butterfly effect'.

The behavior of non-linear systems is governed by what is known as {{positive-feedback}}. What is interesting about positive feedback and the dynamics of non-linear systems is that they disrupt our normal understanding of causality. We tend to think that big effects are the result of big causes. Non-linear systems do not work that way, and instead a very small shift in initial conditions can result in massive system change. It therefore becomes very difficult to determine how an input or change will affect the system, with small actions inadvertently leading to big, unforeseen consequences.


Clarifying Terminology: Positive feedback does not imply a value judgement, with 'positive' being equated with 'good'! Urban decay is an example of a situation where positive feedback may lead to negative outcomes. A cycle of feedback might involve people disinvesting in a neighborhood, such that the quality of the housing stock declines, leading to dropping property values at neighboring sites, further disincentivizing improvements, leading to further disinvestment, and so on.

History Matters!

The non-linearity of complex systems makes them very difficult to predict; instead, we may think of complex adaptive systems as needing to unfold. Hence, {{history-matters}}, since slight variances in a system's history can lead to very different system behaviors.

Example:

A good example of this is comparing the nature of a regular pendulum to a double pendulum.

In the case of a regular pendulum, regardless of how we start it swinging, it will stabilize into a regular oscillating pattern. The history of how, precisely, the pendulum starts off swinging does not really affect the ultimate system behavior: it will stabilize into a regular pattern, replicable over multiple trials, regardless of the starting point. The situation changes dramatically when we move to a double pendulum (a pendulum attached to another pendulum with a hinge point). When we start this pendulum moving, the system will display erratic swinging behaviors - looping over itself and spinning in unpredictable sequences. If we were to restart the pendulum one hundred times, we would see one hundred different patterns of behavior, with no particular sequence repeating itself. Hence, we cannot predict the pendulum's behavior; we can only watch the swinging system unfold. At best, we might observe that the system has certain tendencies, but we cannot outline the exact trajectory of the system's behavior without observing it:

watch the double pendulum!

We can think of the difference between this non-linear behavior and linear systems: if we wish to know the behavior of a billiard ball being shot into a corner pocket, we can calculate the angle and speed of the shot, and reliably determine the trajectory of the ball. A slight change in the angle of the shot leads to only a slight change in the ball's trajectory. Accordingly, people are able to master the game of pool by practicing their shots! If pool behaved like a complex system it would be impossible to master: even with only the most minute variation in our initial shot trajectory, the ball would find its way to a completely different position on the table with every shot.
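The full double-pendulum equations are lengthy, so here is a minimal stand-in with the same sensitive dependence on initial conditions: the chaotic logistic map, a standard textbook example. The starting values below are arbitrary.

```python
# A minimal stand-in for the double pendulum with the same sensitive
# dependence on initial conditions: the chaotic logistic map
# x -> 4x(1 - x). The starting values are arbitrary.

def logistic_trajectory(x, steps=60, r=4.0):
    out = []
    for _ in range(steps):
        x = r * x * (1 - x)
        out.append(x)
    return out

a = logistic_trajectory(0.300000000)
b = logistic_trajectory(0.300000001)   # differs only in the 9th decimal place
gaps = [abs(x - y) for x, y in zip(a, b)]
# The gap starts microscopic and grows until the two trajectories bear
# no resemblance to one another.
```

A billiard-ball (linear) system would keep the two trajectories a billionth apart forever; here the difference compounds at every step, which is precisely why such systems must be watched rather than predicted.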

System Tendencies

That said, a non-linear system might still exhibit certain tendencies. If we allow a complex system to unfold many times (say in a computer simulation), while each simulation yields a different outcome (and some yield highly divergent outcomes), the system may have a tendency to gravitate towards particular regimes. Such regimes of behavior are known as Attractor States. Returning to the pendulum, in our single pendulum experiment the system always goes to the same attractor, oscillating back and forth. But a complex system features multiple attractors, and the 'decision' of which attractor the system tends towards varies according to the initial conditions.

Complex systems can be very difficult to understand due to this non-linearity. We cannot know if a 'big effect' is due to an inherent 'big cause' or if it is something that simply plays out due to reinforcing feedback loops. Such loops amplify small behaviors in ways that can be misleading.

Example:

If a particular scholar is cited frequently, does this necessarily mean that their work has more intrinsic value than that of another scholar with far fewer citations?

Intuitively we would expect that a high level of citations is correlated with a high quality of research output, but some studies have suggested that scholarly impact might also be attributed to the dynamics of positive feedback: a scholar who is randomly cited slightly more often than another scholar of equal merit will tend to attract more attention, which then attracts more citations, which attracts more attention, and so on. Had the scholarly system unfolded in a slightly different manner (with the other scholar initially receiving a few additional citations), the dynamics of the system could have led to a completely divergent outcome. Thus, when we say that complex systems are "Sensitive to Initial Conditions", this is effectively another way of speaking about the non-linearity of the system, and how slight, seemingly innocuous variations in the history of the system can have a dramatic impact on how things ultimately unfold.
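This rich-get-richer dynamic can be sketched with a toy cumulative-advantage model. Everything here is invented for illustration: two scholars of identical merit, with each new citation landing in proportion to citations already held.

```python
import random

# Toy cumulative-advantage (Polya-urn style) model, invented for
# illustration: two scholars of identical merit, with each new citation
# going to a scholar with probability proportional to citations already
# held (plus one, so both can be picked at the start).

def simulate_citations(steps=2000, seed=None):
    rng = random.Random(seed)
    cites = [0, 0]
    for _ in range(steps):
        total = cites[0] + cites[1] + 2
        if rng.random() < (cites[0] + 1) / total:
            cites[0] += 1
        else:
            cites[1] += 1
    return cites

# Replay 'history' twenty times: merit never differs between the two,
# yet the final citation counts routinely diverge dramatically.
runs = [simulate_citations(seed=s) for s in range(20)]
```

Which scholar ends up ahead, and by how much, is decided by the early random citations and then locked in by feedback - a small illustration of sensitivity to initial conditions.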

Some Urban Thoughts

See also far from equilibrium.


 

Nested Scales

Complex Adaptive systems tend to organize themselves into hierarchical, nested and 'scale-free' systems.

Complex systems exhibit important scalar dynamics from two perspectives. First, they are often built up from nested sub-systems, which themselves may be complex systems. Second, at a given scale of inquiry within the system, there will be a tendency for the system to exhibit some form of power-law (or scale-free) dynamics in terms of how it operates. This simply means that a small number of elements within the system will tend to dominate; this domination can manifest in different ways, such as intensity (earthquakes), frequency (citations), or physical size (road networks). In all cases a small ratio of system components (earthquakes, citations, or roads) exerts a large ratio of system impact. Understanding how and why this operates is important in the study of complexity.

Nested Features

To understand what we mean by 'nested', we can think of the human body. At one level of magnification we can regard it as a collection of cells, at another as a collection of organs, at another as a complete body. Further, each body is itself part of a larger collection - perhaps a family, a clan or a tribe - and these in turn, may be part of other, even larger wholes:  cities or nations. In complex systems we constantly think of both parts and wholes, with the whole (at one level of magnification) becoming just a part (at another level of magnification). While we always need to select a scale to focus upon, it is important to note that complex systems are open - so they are affected by what occurs at other scales of inquiry.


When trying to understand any given system within this hierarchy, the impact of subsystems typically occurs near adjacent scales. Thus, while a society can be understood as being composed of humans, composed of bodies, composed of organs, composed of cells, we do not tend to consider the role that cells play in affecting societies. Instead, we attune to understanding interactions between the relevant scales of whatever system we are examining.  Depending on the level of enquiry that we choose,  we may look at the same entity (for example a single human being) and consider it be an emergent 'whole',  or as simply a component part (or agent) within a larger emergent entity (one body within a complex society).

Various definitions of complexity try to capture this shifting nature of agent versus whole, and how this alters depending on the scale of inquiry. Definitions thus point to complex adaptive systems as being hierarchical, or as operating at micro, meso, and macro levels. In his seminal article The Architecture of Complexity, Herbert Simon describes such systems as 'composed of interrelated sub-systems, each of the latter being, in turn, hierarchic in structure until we reach some lowest level of elementary subsystem'.

Why is this the case? And why does it matter?

Simon argues that, by partitioning systems into nested hierarchies, wholes are more apt to remain robust. They maintain their integrity even if parts of the system are compromised. He provides the example of two watchmakers, each of whom builds watches made up of one thousand parts. One watchmaker organizes the watch's components as independent entities, each of which needs to be integrated into the whole in order for the watch to hold together as a stable entity. If one piece is disturbed in the course of the watchmaking, the whole disintegrates, and the watchmaking process needs to start anew. The second watchmaker organizes the watch parts into hierarchical sub-assemblies: ten individual parts make one unit, ten units make one component, and ten components make one watch. For the second watchmaker, each sub-assembly holds together as a stable, integrated entity, so if work is disrupted in the course of making an assembly, the disruption affects only that component (meaning a maximum of ten assembly steps are lost). The rest of the assembled components remain intact.

If Simon is correct, then natural systems may preserve robustness by creating sub-assemblies that each operate as wholes. Accordingly, it is worth considering how human systems might benefit from similar strategies.
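Simon's argument can be made quantitative with a standard result: the expected number of Bernoulli trials needed to complete k consecutive steps when each step fails with a fixed probability and failure scraps the current assembly. The interruption probability below is illustrative.

```python
# A back-of-envelope version of Simon's argument, assuming each assembly
# step is interrupted with probability p, and an interruption scraps only
# the current (sub-)assembly. expected_steps() is the standard expected
# number of Bernoulli trials needed for k consecutive successes.

def expected_steps(k, p):
    q = 1.0 - p
    return (1.0 - q ** k) / (p * q ** k)

p = 0.01                                  # illustrative interruption rate
flat = expected_steps(1000, p)            # one monolithic 1000-part assembly
nested = 111 * expected_steps(10, p)      # 100 units + 10 components + 1 watch, 10 steps each
# Nested work is on the order of a thousand steps; flat is in the millions.
```

The gap is dramatic because the flat watchmaker must survive 1000 uninterrupted steps in a row, while the hierarchical watchmaker never risks losing more than ten steps at a time.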

Scalar Features

Simon's watchmaker is a top-down operator who organizes his work flow into parts and wholes to keep the watch components partitioned and robust, creating a more efficient watch-making process. What is noteworthy is that self-organizing  systems have inherent dynamics that appear to push systems towards such partitioning, and that this partitioning holds specific structural properties related to mathematical regularities.

A host of complex systems exhibit what is known as Self Similarity - meaning that we can 'zoom in' at any level of magnification and find repeated, nested scales. These scale-free hierarchies follow the mathematical regularities of Power Law distributions. These distributions are so common in complex systems that they are often referred to as 'the fingerprint of self-organization' (see Ricardo Solé). We find power-law distributions in systems as diverse as the frequency and magnitude of earthquakes, the structure of academic citation networks, the prices of stocks, and the structure of the World Wide Web.

Further, complex systems tend to 'tune' themselves to what is referred to as Self-Organized Criticality: a state at which the scale or scope of a system's response to an input will follow a power-law distribution, regardless of the intensity (or scope) of the input. While not fully understood, it is believed that systems organize themselves this way because it is a regime in which they can maximize performance while using the minimum amount of available energy. When systems are poised at this state they also have maximum connectivity with the minimum amount of redundancy, and it is believed that they are at their most effective as information processors in this regime.
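The canonical toy model of self-organized criticality is the Bak-Tang-Wiesenfeld sandpile, sketched below. The grid size and number of grain drops are arbitrary choices for illustration.

```python
import random

# Minimal Bak-Tang-Wiesenfeld sandpile, the textbook toy model of
# self-organized criticality: grains are dropped one at a time, any cell
# holding 4+ grains topples one grain to each neighbour, and grains fall
# off the edge of the grid. Grid size and drop count are arbitrary.

def drop_grains(n=20, drops=2000, seed=0):
    rng = random.Random(seed)
    grid = [[0] * n for _ in range(n)]
    avalanche_sizes = []
    for _ in range(drops):
        x, y = rng.randrange(n), rng.randrange(n)
        grid[x][y] += 1
        topples = 0
        unstable = [(x, y)]
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < 4:
                continue
            grid[i][j] -= 4
            topples += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                a, b = i + di, j + dj
                if 0 <= a < n and 0 <= b < n:   # edge cells shed grains off the table
                    grid[a][b] += 1
                    unstable.append((a, b))
        avalanche_sizes.append(topples)
    return grid, avalanche_sizes

grid, sizes = drop_grains()
# After the pile reaches its critical state, avalanches of all sizes occur:
# identical single-grain inputs produce responses from zero topples to huge cascades.
```

Every input is identical (one grain), yet the responses span orders of magnitude: that decoupling of input scale from response scale is the signature of the critical state.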

Why Nested and not Hierarchical?

The attentive surfer of this website content may notice that in the various definitions of complexity being circulated, the term 'hierarchical' is used to describe what we call here 'nested scales'. We have avoided using this term as it holds several connotations that appear unhelpful. First, a hierarchy generally assumes a kind of priority, with 'upper' levels being more significant than lower. Second, it implies control emanating from the top down. Neither of these connotations are appropriate when speaking about complex systems. Each level of nested orders is both a part and a whole, and causality flows both ways as the emergent order is generated by its constituent parts, and steered by those parts as much as it steers (or constrains) its parts once present. We hope that the idea of 'nested scales' is more neutral vis-a-vis notions of primacy and control, but still captures the idea of systems embedded within systems of different scales.

To learn more about these phenomena, see

Power Laws

Self-Organized Criticality

{{scale-free}}

Urban Applications:

See also:


 

Emergence

Complex Adaptive Systems display emergent global features, which hold characteristics that transcend those of the system's individual elements.

Emergence refers to the unexpected manifestation of unique phenomena appearing in a complex system in the absence of top-down control. It can refer both to these novel global phenomena themselves (such as ant trails, Bénard rolls or traffic jams) or to the mathematical regularities - such as power-laws - associated with them.

Overview:

When we see flocks of birds or schools of fish, they appear to operate as integrated wholes, yet the whole is somehow produced without any specific bird or fish being 'in charge'. Nonetheless, emergent, integrated wholes are able to manifest through self-organizing, bottom-up processes, with these wholes exhibiting clear, functional structures. These phenomena are intriguing in part due to their unexpectedness: coordinated behaviors yield an emergent pattern or synchronized outcome that holds properties distinct from those of the individual agents in the system.


Starling Murmuration - an emergent phenomenon

The processes leading to such phenomena are driven by networks of interactions that, because of feedback mechanisms, gradually impose constraints or limits upon the agents within the system (see Degrees of Freedom). Recursive feedback between agents takes what was initially 'free' behavior and gradually constrains or enslaves it into coordinated regimes.

These coordinated, emergent regimes generally feature new behavioral or operational capacities that are not available to the individual elements of the system. In addition, emergent systems often exhibit mathematical pattern regularities (in the form of power-laws) pertaining to the intensity of the emergent phenomena. These intensities tend to be observed in spatial, topological or temporal distributions of the emergent features: there are pattern regularities associated with earthquake magnitudes (across time), city sizes (across space), and website popularity (across links, or 'topologically').

Quite a lot of research in complexity is interested in the emergence of these mathematical pattern regularities, and sometimes it is difficult to decipher which feature of complexity is more important - what the emergent phenomena do (in and of themselves), versus the structural patterns or regularities that these emergent phenomena manifest.

Relation to Self-Organization:

Closely linked to the idea of emergence is that of self-organization, although there are some instances where emergence and self-organization occur in isolation from one another.

Example:

One interesting case of emergence without self-organization is associated with the so-called 'wisdom of crowds'. A classic example of the phenomenon (described in the book of the same name) involves estimating the weight of a cow at a county fair. Experts as well as non-experts were asked to estimate the cow's weight: fair attendees were given the chance to guess a weight and put their guess into a box, with none of the attendees aware of the estimates being made by others. Nonetheless, when all the guesses from the attendees were averaged, the weight the 'crowd' had collectively determined was closer to the cow's true weight than the estimates made by the experts. The correct weight of the cow 'emerged' from the collective, but no self-organizing processes were involved - simply independent guesses.
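The statistical core of the story is easy to re-run in code. This is a stylized sketch: the true weight, crowd size, and noise level below are all invented for illustration.

```python
import random

# Stylized re-run of the county-fair story; the true weight, crowd size,
# and noise level are all invented for illustration.

def crowd_guess(true_weight=1198, guessers=800, noise=120, seed=3):
    rng = random.Random(seed)
    guesses = [rng.gauss(true_weight, noise) for _ in range(guessers)]
    crowd_estimate = sum(guesses) / len(guesses)
    crowd_error = abs(crowd_estimate - true_weight)
    typical_error = sum(abs(g - true_weight) for g in guesses) / len(guesses)
    return crowd_error, typical_error

crowd_error, typical_error = crowd_guess()
# The averaged guess lands far closer to the truth than a typical
# individual guess does.
```

Because the guesses are independent, their errors partly cancel when averaged: the error of the mean shrinks roughly with the square root of the number of guessers, with no coordination among them required.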

Despite there being examples of emergence without self-organization (as well as self-organization without emergence), in the case of Complex Adaptive Systems these two concepts are highly linked, making it difficult to speak about one without the other. If there is a meaningful distinction, it is that Self-Organization focuses on the character of interactions occurring amongst the Bottom-up Agents of a complex system, whereas emergence highlights the global phenomena that appear in light of these interactions.

Enslavement:

At the same time, the concepts are interwoven, since emergent properties of a system tend to constrain the behaviors of the agents forming that system. Hermann Haken frames this through the idea of an Enslaved State, where agents in a system come to be constrained as a result of phenomena they themselves created.

Example:

An interesting illustration of the phenomenon of 'enslavement' can be found in ant-trail formation. Ants, which initially explore food sources at random, gradually have their random explorations constrained by the signals provided by pheromones (which are deployed as ants randomly discover food). The ants, responding in a bottom-up manner to these signals, gradually self-organize their search and generate a trail. The trail is the emergent phenomenon, and self-organization - as a collective dynamic distributed across the colony - 'works' to steer individual ant behavior. That said, once a trail emerges, it acts as a kind of 'top-down' device that constrains subsequent ant trajectories.

Emergence poses ontological questions concerning where agency is located - that is, what is acting upon what. The source of agency becomes muddy as agent behaviors (the local level) give rise to emergent manifestations (the global level), which subsequently constrain further agent behaviors (and so forth). This is of interest to those drawn to the philosophical implications of complexity.

There is a very tight coupling in these systems between a system's components and the environment that the components are acting within. One specific characteristic of the environment is that this environment also consists of system elements. Consequently, as elements shift in response to their environmental context, they are also helping produce a new environmental context for elements within that system. This results in the components and environment forming a kind of closed loop of interactions. These kinds of loops of behaviors, that lead to forms of self-regulation, were the object of study for early Cybernetics thinkers.

Urban Interpretations:

The concept of Emergence has become increasingly popular in urban discourses. While some urban features come about through top-down planning (like, for example, the decision to build a park), other kinds of urban phenomena seem to arise through bottom-up emergent processes (for example, a particular park becoming the site of drug deals). It should be noted that not all emergent phenomena are positive! In some cases, we may wish to help steward along emergent characteristics that we deem positive for urban health, while in other cases we may wish to dismantle the kinds of feedback mechanisms that create spirals of decay or crime.

The concept of emergence can be approached very differently depending on the aims of a particular discourse. For example, Urban Modeling often highlights the emergence of Power Laws in the ratios of different kinds of urban phenomena. A classic example is the presence of power-law distributions in city sizes, whereby the populations of cities in a country follow a power-law distribution; but one can also examine power-law distributions within rather than between cities, looking at such characteristics as road systems, restaurants, or other civic amenities.

Others, such as those engaged in the field of Evolutionary Economic Geography (EEG), are intrigued by different kinds of physical patterns of organization. EEG attunes to how 'clusters' of firms, or 'agglomerations', appear in various settings in the absence of top-down coordination. Its practitioners try to unpack the mechanisms whereby firms are able to self-organize to create these clusters, rather than looking at any particular mathematical regularities or power-law attributes associated with such clusters.

Still other urban discourses, including Relational Geography and Assemblage Geography, are focused on how agents come together to create a host of entities: buildings, institutions, building plans, etc. - but tend to place their attention on coordination mechanisms and flows that steer how such entities are able to emerge.

Accordingly, different discourses attune to very different aspects of complexity.

Proviso:

While this entry provides a general introduction to emergence (and self-organization), there are other interpretations of these phenomena that disambiguate the concepts with reference to Information theory. These interpretations focus upon the amount of information (in a Shannonian sense) required to describe self-organizing versus emergent dynamics.

While these definitions can be instructive, they remain somewhat controversial. There is no absolute consensus about how complexity can be defined using mathematical measures (for an excellent review of various measures, check the FEED for Ladyman, Lambert and Wiesner, 2012). Often, an appeal is made to the idea of 'somewhere between order and randomness'. But this only tells us what complexity is not, rather than what it is. The explanation provided here is intended to outline the terminology in a more intuitive way that, while not mathematically precise, makes the concepts workable.

Hopefully, we are able to recognize complexity when we see it!

Related social sciences terms:

Stabilized Assemblages


 

Driving Flows

Complex Systems exchange energy and information with their surrounding contexts. These flows between system and context help structure the system.

According to the second law of thermodynamics, a system left to its own devices will eventually lose order: hot coffee poured into cold dissipates its heat until all the coffee in the cup is the same temperature; matter breaks down over time when exposed to the elements; systems lose structure and differentiation. The same is not true for complex systems: they gain order, structure, and information.

This is because such systems, while operating as bounded 'wholes', are not entirely bounded. They remain open to their environment, and the environment, in some fashion, 'feeds' or 'drives' the system: providing energy that can be used by the system to build and retain structure. Thus complex systems appear to defy the second law of thermodynamics: rather than tending towards disorder (entropy), they are pushed towards order (negentropy). Strictly speaking, no violation occurs - the order gained within the system is paid for by the disorder (dissipated energy) exported to its surroundings.


What constitutes a flow?

In general, we can conceptualize flows as some form of energy that helps drive or push the system. But what do we mean by energy? And what kinds of energy flows should we pay attention to in the context of complexity?

In some cases, the source of system energy aligns with a strictly technical definition of what we think of when we say 'energy'. Such is the case in the classic example of 'Bénard rolls' (see Open / Dissipative). Here, a coherent, emergent 'roll' pattern is generated by exciting water molecules by means of a heat source. It becomes relatively straightforward to identify thermal energy as the flow driving the system: heat enters the water system from below, dissipates to the environment above, and drives emergent water-roll activity in between.

But there are a host of different kinds of complex systems whose driving flows do not align with this strict conception of 'energy'.

Example:

In an academic citation network, citations could be perceived as the 'energy' or flow that drives the system towards self-organization. As more citations are gathered, a scholar's reputation is enhanced, and more citations flow towards that scholar.  A pattern of scholarly achievement emerges (that follows a power-law distribution), due to the way in which the 'energy flows' of scholarly recognition (citations), are distributed within the system. While we tend to think that citations are based on merit, a number of studies have been able to replicate patterns that echo citation distribution ratios using only the kinds of mechanisms we would expect to operate within a complex system - with no merit required (see also Preferential Attachment!).
Similarly, the stock market can be considered a complex adaptive system, with stock prices forming the flow that helps steer system behavior; the world wide web can be considered a complex adaptive system, with website clicks serving as a key flow; and the way Netflix organizes recommendations can be considered a complex adaptive system, with movies watched serving as the flow that directs the system towards new recommendations.
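The 'rich get richer' mechanism behind such citation patterns can be sketched in a few lines of Python. This is an illustrative toy model of preferential attachment (the paper counts and random seed are arbitrary), not a model of any particular citation dataset:

```python
import random

def preferential_attachment(n_papers, seed=42):
    """Each new paper cites one existing paper, chosen with probability
    proportional to the citations that paper already has."""
    rng = random.Random(seed)
    citations = [1, 1]  # two seed papers
    for _ in range(n_papers - 2):
        # weighted choice: well-cited papers attract new citations
        idx = rng.choices(range(len(citations)), weights=citations)[0]
        citations[idx] += 1
        citations.append(1)  # the newcomer starts with a single citation
    return citations

counts = preferential_attachment(2000)
median = sorted(counts)[len(counts) // 2]
# No merit involved, yet a few 'star' papers vastly outpace the median
print(max(counts), median)
```

Plotting `counts` on log-log axes would show the heavy-tailed, power-law-like distribution discussed above.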

Clearly, it is helpful to understand the nature of the driving flows within any given complex system, as these are what push the system along a particular trajectory. For ants (which form emergent trails), food is the energy driving the system. The ants adjust their behaviors in order to gain access to differential flows (or sources) of food in the most effective way possible given the knowledge of the colony. In this case, the caloric value of the food stocks found is a good way to track the effectiveness of ant behavior.

If we look at different systems, we should be able to somehow 'count' how flow is directed and processed: citation counts, stock prices, website clicks, movies watched.

Multiple Flows:

Often complex systems are subject to more than one kind of flow that steers dynamics. For example, we can look at the complex population dynamics of a species within an environment with a limited carrying capacity. Here, two flows are of interest: the intensity of reproduction (or the flow of new entrants into the environmental context), and the flow of food supplies (which limits how much population can be sustained). One flow rate drives the system (reproductive rate), while another flow rate chokes the system (carrying capacity). This interaction between two input flows (one driving and the other constraining) produces very interesting emergent dynamics that lead the system to oscillate, or move periodically from one 'state' (or attractor) to another. A more colloquial way of thinking about this periodic cycling is captured in the idea of 'boom' and 'bust' cycles, although there are other kinds of cycles that involve moving between more than two regimes (see Bifurcations for more!).
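This driving/choking interaction is captured by the classic logistic map, x' = r·x·(1−x): the growth rate r drives the system, while the (1−x) term chokes it as the population approaches carrying capacity. A minimal sketch, with parameter values chosen only for illustration:

```python
def logistic_tail(r, x0=0.5, transient=200, keep=8):
    """Iterate x' = r*x*(1-x), discard the transient, and return the
    long-run values the population settles into."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(round(x, 4))
    return tail

print(logistic_tail(2.8))  # settles onto a single steady state (one attractor)
print(logistic_tail(3.2))  # cycles between two values: a 'boom and bust' pattern
```

Pushing r higher still (past roughly 3.57) yields chaotic cycling among many regimes, which connects to the Bifurcations entry mentioned above.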

Go with the flow:

Flow is the source of energy that drives self-organizing processes. A complex system is a collection of agents operating within a kind of loose boundary, and flow is what comes in from the outside and is then processed by these agents. Food is not part of the ant colony system, but it is what drives colony dynamics. The magic of self-organization is that, rather than each agent needing to independently figure out how best to access and optimize this external flow, each agent can learn from what its neighbors are doing.

Accordingly, there are two kinds of flows in a complex system - the external flow that needs to be internalized and processed, and the internal flows amongst agents that help signal the best way to get the job done. As agents move into regimes that process flows in ways that minimize energy requirements, they draw other agents along into similar regimes of behavior making the system, as a whole, an efficient energy processor.

It's all about difference:

Every complex system channels its own specific form of driving flow.

In every case, it is important to look beyond technical definitions of energy flows in complex systems, and instead to understand these as the differences that matter to the agents in the system. All complex systems involve some sort of differential, and this differential is regulated by an imbalance of flows that thereby steers subsequent agent actions. As the system realigns itself by attuning to these differentials, new behaviors or patterns emerge that, in some way, optimize behaviors.


 

Bottom-up Agents

Complex Adaptive Systems are comprised of multiple, parallel agents, whose coordinated behaviors lead to emergent global outcomes.

CAS are composed of populations of discrete elements - be they water molecules, ants, neurons, etc. - that nonetheless behave as a group. At the group level, novel forms of global order arise strictly due to simple interactions occurring at the level of the elements. Accordingly, CAS are described as "Bottom-up": global order is generated from below rather than coordinated from above. That said, once global features have manifested, they stabilize - spurring a recursive loop that alters the environment within which the elements operate and constrains subsequent system performance.

What is an Agent?

Bikes, Barber shops, Beer glasses, Benches. We can ask what an agent is, but we could equally ask what an agent is not!

Defining an agent is not so much about focusing on a particular kind of entity as about defining a particular kind of performance within a given system and that system's context. Many elements of day-to-day life might be thought of as agents, but to do so, we need to ask how agency is operationalized.


Imagine that I have a collection of 1,000 bikes that I wish to place for rent in the city. Could I conceive of a self-organizing system where bikes are agents - where the best bike arrangement emerges, with bikes helping each other 'learn' where the best flow of consumers is? What if I have 50 barber shops in a town of 500,000 inhabitants - should the shops be placed in a row next to one another? Placed equidistant apart? Distributed in clusters of varying sizes and distances apart (maybe following power laws)? Might the barber shops be conceptualized as agents competing for flows of customers in a civic context, trying to maximize gains while learning from their competitors? And what about beer glasses: if I have a street festival where I want all beer glasses to wind up being recycled rather than littering the ground, what mechanisms would I need to put in place to encourage the beer glasses to act as agents - agents that are more 'fit' if they find their way into recycling depots? How could I operationalize the beer glasses so that they co-opt their consumers into ensuring that this occurs? What would a 'fit' beer glass be like in this case (hint: a high-priced deposit)? Finally, who is to say where the best place is to put a park bench? If a bench is an agent, and 100 benches in a park are a system, could benches self-organize to position themselves where they are most 'fit'?

The examples above are somewhat fanciful but they are being used to illustrate a point: there is no inherent constraint on the kinds of entities we might position as agents within a complex system. Instead, we need to look at how we frame the system, and do so in ways where entities can be operationalized as agents.

Operational Characteristics:

  • having a common fitness criterion shared amongst agents (with some performances being better than others);
  • having an ability to shift performance (see Requisite Variety);
  • having an ability to exchange information amongst other agents (to reach better performance faster);
  • operating in an environment where there is a meaningful difference available that drives behavior (see Driving Flows).

'Classic' Agents

An odd list of potential agentic entities has been provided above (the 'B' list) that might, under specific circumstances, be transformed into operational agents. We start here to avoid the problem of limiting the scope of what may or may not be an agent. That said, these are not part of what might be thought of as the complexity 'canon' of agents (the 'A' list). Let us turn to these now:

Those drawn to the study of complex systems were initially compelled to explore agent dynamics because of certain examples that showed highly unexpected emergent aspects. These include 'the classics' (described elsewhere on this website): emergent ant trails, coordinated by individual ants; emergent convection patterns, coordinated by water molecules in Rayleigh-Bénard convection; and emergent higher thought processes, coordinated by individual neuron firings.

In each case, we see natural systems composed of a multitude of entities (agents) that, without any level of higher control, are able to work together to coalesce into something with characteristics that go above and beyond the properties of the individual agents. But if we consider the operational characteristics at play, they are no different from the more counter-intuitive examples listed above. Take ants as an example. Each ant is an agent that has:

  • a common fitness criterion shared amongst agents (getting food);
  • an ability to shift performance (searching a different place);
  • an ability to exchange information amongst other agents (deploying/detecting pheromones);
  • an environment where there is a meaningful difference available that drives behavior (presence of food).

Ant trails emerge as a result of ant interaction, but the agents in the system are not actively striving to achieve any predetermined 'global' structure or pattern: they are simply behaving in ways that optimize their own performance within a given context, with that context including the signals or information gleaned from other agents pursuing similar performance goals. Since all agents pursue identical goals, coordination amongst agents leads to a faster discovery of fit performance regimes. What is unexpected is that, taken as a collective, the coordinated regime has global, novel features. This is the case in ALL complex systems, regardless of the kinds of agents involved.

Finally, once emergent states appear, they constrain subsequent agent behavior, which then tends to replicate itself. Useful here are Maturana and Varela's notion of autopoiesis as well as Hermann Haken's concept of Enslaved States. Global order or patterns (which emerge through random behaviors conditioned by feedback) tend to stabilize and self-maintain.

Modeling Agents:

While the agents that inspired interest in complexity operate in the real world, scientists quickly realized that computers provide a perfect medium for exploring the kinds of agent behaviors we see operating. Computers are ideal for exploring agent behavior because many 'real world' agents obey very simple rules or behavioral protocols, and because the emergence of complexity occurs as a step-by-step (iterative) process. At each time step, each agent takes stock of its context and adjusts its next action or movement based on feedback from its last move and from the last moves of its neighbors.

Computers are an ideal medium for mimicking these processes since, with code, it is straightforward to replicate a vast population of agents and to run simulations that enable each individual agent to adjust its strategy at every time step. Investigations into such 'automata' informed the research of early computer scientists, including such luminaries as von Neumann, Wolfram, Conway, and Epstein & Axtell (for more on their contributions, check 'Key Thinkers' in the FEED!).

In the most basic versions of these automata, agents are considered as cells on an infinite grid, and cell behavior can be either 'on' or 'off' depending on a rule set that uses neighboring cell states as the input source.

Conway's Game of Life: a classic cellular automaton

These early simulations employed Cellular Automata (CA), and later moved on to Agent-Based Models (ABM), which were able to create more heterogeneous collections of agents with more diverse rule sets. Both CA and ABM aim to discover whether stable patterns of global agent behavior emerge through interactions carried out over multiple iterations at the local level. These experiments successfully demonstrated how order does emerge through simple agent rules, and simulations have become, by far, the most common way of engaging with the complexity sciences.
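To make the 'simple local rules' point concrete, here is a minimal, set-based Game of Life in Python. This is an illustrative sketch (live cells stored as coordinates, so no explicit grid is needed):

```python
from collections import Counter

def life_step(live):
    """One Game of Life iteration. Each cell's fate depends only on its
    eight neighbours - a purely local, bottom-up rule set."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next step if it has 3 neighbours,
    # or 2 neighbours and is already alive.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A 'blinker': three cells in a row oscillate with period two
blinker = {(0, 1), (1, 1), (2, 1)}
print(life_step(blinker))                        # flips to a vertical bar
print(life_step(life_step(blinker)) == blinker)  # True: period-two cycle
```

Nothing in `life_step` mentions bars, gliders, or oscillators; these global patterns emerge entirely from the local rule.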

While these models can be quite dramatic, they are just one tool for exploring the field and should not be confused with the field itself. Models are very good at helping us understand certain aspects of complexity, but less effective in helping us operationalize complexity dynamics in real-world settings. Further, while CA and ABM demonstrate how emergent, complex features can arise from simple rules, the rule sets involved are established by the programmer and do not evolve within the program.

A further exploration of agents in CAS incorporates the ways in which bottom-up agents might independently evolve rules in response to feedback. Here agents test various Rules/Schemata over the course of multiple iterations. Through this trial and error process, involving Time/Iterations, they are able to assess their success through Feedback and retain useful patterns that increase Fitness. This is at the root of machine learning, with strategies such as genetic algorithms mimicking evolutionary trial and error in light of a given task.

competing agents are more fit as they walk faster!

John Holland describes how agents, each independently exploring suitable schemata, actions, or rules, can be viewed as adopting General Darwinian adaptive processes to carry out 'search' algorithms. In order for this search to proceed in a viable manner, agents need to possess what Ross Ashby dubbed Requisite Variety: sufficient heterogeneity to test multiple scenarios or rule-enactment strategies. Without this variety, little can occur. It follows that we should always examine the capacity of agents to respond to their context, and determine whether that capacity is sufficient to deal with the flows and forces they are likely to encounter.

Further, we can speed up the discovery of 'fit' strategies if we have one of two things: more agents testing, or more sequential iterations of tests. Finally, we benefit if improvements achieved by one agent can propagate (be reproduced) within the broader agent population.
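A genetic-algorithm sketch of this Variation-Selection-Retention search follows. The 20-bit target pattern and the population/generation parameters are arbitrary illustrative choices, standing in for a 'fit' rule that agents must discover:

```python
import random

def evolve(target, pop_size=30, generations=60, seed=1):
    """Variation (one-bit mutation), Selection (keep the fitter half),
    Retention (survivors are copied into the next generation)."""
    rng = random.Random(seed)
    n = len(target)

    def fitness(s):
        # fitness = number of bits matching the target schema
        return sum(a == b for a, b in zip(s, target))

    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # Selection
        pop = [s[:] for s in survivors]            # Retention
        for parent in survivors:                   # Variation
            child = parent[:]
            i = rng.randrange(n)
            child[i] = 1 - child[i]                # flip one random bit
            pop.append(child)
    return max(fitness(s) for s in pop)

best = evolve([1, 0] * 10)  # a 20-bit schema standing in for a 'fit' rule
print(best)  # approaches the maximum of 20 as fit variants propagate
```

Increasing `pop_size` (more agents testing) or `generations` (more sequential iterations) both speed the discovery of fit strategies, echoing the point above.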


 

Adaptive Capacity

Complex systems are able to adjust their behaviors in response to inputs. This adaptive capacity allows these systems to achieve a better 'fit' within their context.

We are all familiar with the concept of adaptation as it relates to evolution, with Darwin outlining how species diversity is made possible by mutations that enhance a species' capacity to survive and thereby reproduce. Over time, mutations that are well-adapted to a given context will survive, and ill-adapted ones will perish. Through this simple process - repeated in parallel over multiple generations - species are generated that are supremely tuned to their environmental context. While adaptation principles have their basis in biological evolution, a more 'general' Darwinism looks to processes outside the biological context to see how similar mechanisms may be at play in a broad range of systems. Accordingly, any system that has the capacity for Variation, Selection, and Retention (VSR) is able to adapt and become more 'fit'.


Eye on the target - Identifying what is being adapted for:

All complex systems involve channeling flows in the most efficient way possible - achieving the maximum gain for the minimum expenditure - and 'discovering' this efficiency can be thought of as achieving 'fit' behavior. When looking at a system's adaptive behavior, one therefore needs first to understand how fit regimes are operationalized, by considering:

  1. What constitutes a 'fit' outcome;
  2. How the system registers behaviors that move closer to this outcome (see Stigmergy);
  3. The capacity of agents in the system to adjust their behaviors so as to better align with strategies moving closer to the 'fit' goal.

It is this third point, pertaining to the 'adaptive capacity' of agents, that we wish to examine in more depth.

Variation, Selection, Retention (VSR):

If we consider the example of ant trail formation, behaviors that lead to the discovery of food are more 'fit'. Adopting the lens of Variation, Selection and Retention, the system unfolds as follows:

  1. A collection of agents (ants), seek food (environmental differential) following random trajectories (Variation).
  2. Ants that randomly stumble upon food leave a pheromone signal in the vicinity. This pheromone signal indicates to other ants that certain trajectories within their random search are more viable than others (Selection).
  3. Ants adjust their random trajectories according to the pheromone traces, reinforcing successful food pathways and broadcasting these to surrounding members of the colony (Retention).
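The three steps above can be caricatured in a deterministic two-route model, where pheromone on a shorter route is reinforced more strongly per trip. The route names, deposit strengths, and evaporation rate are illustrative assumptions, not measured ant parameters:

```python
def trail_formation(trips=500, evaporation=0.02):
    """Share of colony traffic on the short route after repeated trips.
    Each trip, routes are reinforced in proportion to the share of ants
    currently using them (Selection), then all traces fade slightly."""
    pheromone = {"short": 1.0, "long": 1.0}   # start with no preference
    deposit = {"short": 1.0, "long": 0.5}     # shorter trip => stronger trace
    for _ in range(trips):
        total = sum(pheromone.values())
        for route in pheromone:
            share = pheromone[route] / total        # fraction choosing route
            pheromone[route] += share * deposit[route]
            pheromone[route] *= 1 - evaporation     # traces evaporate
    return pheromone["short"] / sum(pheromone.values())

print(round(trail_formation(), 3))  # close to 1.0: the trail has 'locked in'
```

The positive feedback between choice share and pheromone strength (Retention) is what converts an initially even split into a single dominant trail.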

What emerges from this adaptive process is an ant trail: a self-organizing phenomenon steered by the adaptive dynamics of a system seeking to minimize the global energy expended in finding food. What is important to note is that the adaptation occurs at the level of the entire group, or system. The colony as a whole coordinates its behavior to achieve overall fitness, with food availability (the source of fitness) being the differential input that drives the system. The ants help steer one another and, overall, the behavior of the colony is adaptive. Individual ants might still veer off track and deplete energy looking for food, but this remains a useful strategy in cases where existing food sources become depleted. Transfer of information about successful strategies is critical to ensuring that more effective variants of behavior propagate throughout the colony.

None of this is meant to imply that, if the ants follow this protocol, they will find the most abundant food source available. Complexity does not necessarily result in perfect emergent outcomes. What it does result in is outcomes that are 'satisficing': outcomes that allocate system resources as effectively as possible within the constraints of limited knowledge. Further, the system can change over time, meaning that other, more optimal performance levels may be discovered as time unfolds.

Capacity to Change:

An agent's ability to vary its behavior, select for behaviors that bring it closer to a goal, and then retain (or reproduce) these behaviors is what makes agents in a complex system 'adaptive'. If agents do not possess the capacity to change their outputs in response to environmental inputs, then no adaptive processes can occur.

While this might at first seem self-evident, this basic concept is often overlooked. In particular, it is easy to think of a system composed of diverse components as being 'complex' without considering whether the elements within the system have some inherent ability to adjust in relation to this complex context.

Example:

Consider an airplane. It is a system comprised of a host of components and together these components interact in ways that makes flight possible. That said, each component is not imbued with the inherent ability to adjust its behavior in response to shifting environmental inputs. The range of behaviors available to the plane's components are fixed according to pre-determined design specifications. The machine components are not intended to learn how to fly better (adjusting how they operate) in response to feedback they receive over the course of every flight.

If we try to understand an airplane as a complex system, and identify 'flying better' (using less energy to go further) as our measure of fitness, would it be meaningful to speak about the system adapting? If the agents in the plane's system are the individual components, are they capable of variation, selection, and retention? Even if we were to model system behavior from the top down, testing design variants in components, the system itself would not really be 'self-organizing': without external tinkering, nothing would happen.

'Seeking' fitness without volition:

Does it follow that inanimate objects are incapable of self-organization without top down control?

It is reasonably easy to understand adaptation within a system where the agents possess some form of volition. What is intriguing is that many complex systems move towards fit regimes regardless of whether or not the agents of the system have any sort of 'agency' or awareness regarding what they do or do not do.

Example: Coordination of Metronomes:

In the video below, we see a group of metronomes gradually coordinating their behaviors so as to synchronize to a regular rhythm and direction of motion. While this is an emergent outcome, it is initially unclear how one might see this as a kind of 'adaptation'. But if we look to the principles of VSR, we see how this occurs. First we observe a series of agents (metronomes), displaying a high degree of variety in how they beat (in relation to one another). The system has a shared environmental context (the plank upon which the metronomes sit), which acts as a subtle means of signal transfer between the metronomes. The plank moves parallel to the direction of metronome motion, creating resistance or 'drag' in relation to the oscillation of the metronomes on its surface. Some individual metronome movements encounter more resistance in relation to this environment (the sliding plank), while some movements encounter less (a more efficient use of energy). These differentials cause each metronome to alter its rhythm - ever so slightly - until all metronomes move in sync.

Watch the metronomes go into sync!

Considered as VSR, we observe the following:

  1. There is Variation in the metronome movements, with certain oscillatory trajectories encountering more friction and resistance than others;
  2. The physics of these resistance forces creates a Selection mechanism, whereby each metronome alters its oscillatory pattern in response to how much resistance it encounters;
  3. As more metronomes enter into coordinated oscillating regimes, more resistance force is exerted on any outliers, gradually pushing them into sync. Once tuned to this synchronized behavior, the system as a whole optimizes its energy expenditure, and the behavior persists (Retention).
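Synchronization of this kind is commonly studied with coupled phase oscillators (the Kuramoto model). The sketch below illustrates the general mechanism rather than modeling the specific metronome video; the frequencies and coupling strength are arbitrary assumptions:

```python
import math

def synchronize(n=5, coupling=1.5, dt=0.05, steps=2000):
    """Oscillators with different natural frequencies (Variation) pulled
    toward the group's mean phase by a shared coupling (the 'plank')."""
    freqs = [1.0 + 0.1 * i for i in range(n)]   # slightly mismatched tempos
    phases = [0.5 * i for i in range(n)]        # out-of-sync start
    for _ in range(steps):
        mean_x = sum(math.cos(p) for p in phases) / n
        mean_y = sum(math.sin(p) for p in phases) / n
        # each oscillator is nudged toward the collective rhythm
        phases = [
            p + (freqs[i] + coupling *
                 (mean_y * math.cos(p) - mean_x * math.sin(p))) * dt
            for i, p in enumerate(phases)
        ]
    # order parameter r: 0 = incoherent, 1 = perfectly synchronized
    return math.hypot(sum(math.cos(p) for p in phases) / n,
                      sum(math.sin(p) for p in phases) / n)

print(round(synchronize(), 2))  # near 1.0: the group has locked together
```

Lowering `coupling` toward zero (a stiffer plank, less signal transfer) leaves the mismatched frequencies uncoordinated, which is why the shared medium matters.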

Keep it to a minimum!:

The system adapts to the point where overall resistance to motion is minimized. The metronomes 'achieve' the most for the least effort: a kind of fitness within their context.

While the form of 'minimization' varies, all complex systems involve seeking out behaviors that conserve energy - where the system, as a whole, processes the flows it encounters while expending as little redundant energy as possible. While this cannot always be perfectly achieved, it is this minimizing trajectory that helps steer system dynamics.


 


 

Fields Galore!

This is a nice home page for this section, not sure what goes here.

11:11 - Urban Modeling
Related

56, 88, 72, 
26, 23, 24, 25, 22, 

28:28 - Urban Datascapes
Related

66, 73, 72, 
24, 25, 22, 

17:17 - Tactical Urbanism
Related


25, 21, 

14:14 - Resilient Urbanism
Related


26, 23, 22, 

19:19 - Relational Geography
Related

84, 75, 
26, 25, 

10:10 - Parametric Urbanism
Related

75, 78, 
25, 22, 21, 

15:15 - Landscape Urbanism
Related

56, 88, 
26, 24, 25, 21, 

16:16 - Informal Urbanism
Related

56, 88, 
23, 24, 22, 21, 

13:13 - Generative Urbanism
Related

56, 88, 
24, 22, 21, 

12:12 - Evolutionary Geography
Related

93, 88, 72, 
26, 24, 22, 21, 

18:18 - Communicative Planning
Related


25, 22, 

20:20 - Assemblage Geography
Related


26, 24, 25, 

 

Urban Modeling

Cellular Automata & Agent-Based Models can provide city simulations whose behaviors we learn from. What are the strengths & weaknesses of this mode of engaging urban complexity?

Governing Features ↑

There is a large body of research that employs computational techniques - in particular, agent-based modeling (ABM) and cellular automata (CA) - to understand complex urban dynamics. This research looks at rule-based systems that yield emergent structures.




 

Urban Datascapes

Increasingly, data is guiding how cities are built and managed.  'Datascapes' are derived from our actions but also then steer them. How do humans and data interact in complex ways?

Governing Features ↑

More and more, the proliferation of data is changing the ways in which we inhabit space... and so forth.


Key Concepts:

{{networks}}

{{information}}

Fitness



 

Tactical Urbanism

Tactical interventions are light, quick and cheap - but if deployed using a complexity lens, could they be a generative learning tool that helps make our cities more fit?

Governing Features ↑

Tactical Urbanism is a branch of urban thinking that tries to understand the benefits of grassroots, bottom-up initiatives in creating meaningful urban space. While it does not associate itself directly with complexity theory, many of the tools it employs - particularly its way of 'learning by doing' - tie in with adaptive and emergent concepts from complexity theory.




 

Resilient Urbanism

How can our cities adapt and evolve in the face of change? Can complexity theory help us provide our cities with more adaptive capacity to respond to uncertain circumstances?

Governing Features ↑

Increasingly, we are becoming concerned with how we can make cities that are able to respond to change and stress. Resilient urbanism takes guidance from some complexity principles with regards to how the urban fabric can adapt to change.




 

Relational Geography

If geography is not composed of places, but rather places are the result of relations, then how can an understanding of complex flows and network dynamics help us unravel the nature of place?

Governing Features ↑

Relational Geographers examine how particular places are constituted by forces and flows that operate at a distance. They recognize that flows of energy, people, resources and materials are what activate place, and they focus their attention on understanding the nature of these flows. {{network-topology}} {{path-dependency}}




 

Parametric Urbanism

New ways of modeling the physical shape of cities allow us to shape-shift at the touch of a keystroke. Can this ability to generate a multiplicity of possible future urbanities help make better cities?

Governing Features ↑

Parametric approaches to urban design are based on creating responsive models of urban contexts that are programmed to change form according to how inputs are varied. Rather than creating a final product, the architect instead creates a space of possibilities that is activated according to how various flow variables - economic, environmental, or social - are tweaked. This form of architectural form-making holds similarities to complex systems in terms of how entities are framed: less as objects in and of themselves, and more as responsive, adaptive agents.
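The idea of a design as a 'space of possibilities' rather than a fixed object can be sketched in a few lines of code. The sketch below is purely illustrative - the variable names, formulas, and weightings are invented for this example, not drawn from any actual parametric design tool:

```python
# A minimal sketch of a parametric urban-form model: a block's built form
# is not a fixed object but a function of input "flow" variables.
# All names and formulas here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class BlockForm:
    footprint: float    # fraction of the site that is built (0-1)
    height: float       # storeys
    green_space: float  # fraction of the site left open

def generate_block(economic_demand: float, environmental_weight: float) -> BlockForm:
    """Map input variables (each 0-1) to one point in the space of possible forms."""
    footprint = 0.3 + 0.5 * economic_demand * (1 - environmental_weight)
    height = 2 + 18 * economic_demand
    green_space = 1 - footprint
    return BlockForm(round(footprint, 2), round(height, 1), round(green_space, 2))

# Varying the inputs sweeps out a family of designs rather than one design:
for demand in (0.2, 0.5, 0.9):
    print(generate_block(demand, environmental_weight=0.4))
```

The point of the sketch is the framing: the designed object is the *function*, and any particular built form is just one activation of it.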




 

Landscape Urbanism

Landscape Urbanists are interested in adaptation, processes, and flows, with their work drawing from the lexicon of complexity sciences.

Governing Features ↑

A large body of contemporary landscape design thinking tries to understand how designs can be less about making things, and more about stewarding processes that create a 'fit' between the intervention and the context. Landscape Urbanists advancing these techniques draw a large portion of their vocabulary from the lexicon of complex adaptive systems theory.




 

Informal Urbanism

Many cities around the world self-build without top-down control. What do these processes have in common with complexity?

Governing Features ↑

Cities around the world are growing without the capacity of top-down control. Informal urbanism is an example of bottom-up processes that shape the city. Can these processes be harnessed in ways that make them more effective and productive?






 

Generative Urbanism

Cities traditionally evolved over time, shifting to meet user needs. How might complexity theory help us emulate such processes to generate 'fit' cities?

Governing Features ↑

Some urban thinkers consider how the morphological characteristics of the city enable it to evolve incrementally over time. This branch of urban thinking treats time and evolution as key to generating fit urban spaces.




 

Evolutionary Geography

Across the globe we find spatial agglomerations of common economic activity. How does complexity help us understand the emergence of economic clusters?

Governing Features ↑

Evolutionary Economic Geography (EEG) tries to understand how economic agglomerations or clusters emerge from the bottom up. This branch of economics draws significantly from principles of complexity and emergence, seeing the rise of particular regions as path-dependent, and trying to understand the forces that drive change for economic agents - the firms that make up our economic environment.
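The path-dependent logic of cluster formation can be illustrated with a toy simulation. This is my own illustrative sketch (a simple preferential-attachment process, not a model taken from the EEG literature): each new firm tends to locate where firms already are, so small early accidents of history get amplified and locked in:

```python
# Toy sketch of path-dependent clustering: firms preferentially locate
# in regions that already host firms (agglomeration benefits), so chance
# early choices determine which region comes to dominate.
# All parameter values here are illustrative assumptions.

import random

def simulate_clusters(n_firms=1000, n_regions=5, attachment=0.9, seed=None):
    rng = random.Random(seed)
    counts = [1] * n_regions  # seed each region with one firm
    for _ in range(n_firms):
        if rng.random() < attachment:
            # locate with probability proportional to firms already present
            region = rng.choices(range(n_regions), weights=counts)[0]
        else:
            region = rng.randrange(n_regions)  # occasional independent choice
        counts[region] += 1
    return counts

# Identical rules, different random histories -> different dominant regions:
for seed in (1, 2, 3):
    print(simulate_clusters(seed=seed))
```

Running the same rules with different random seeds typically yields a different dominant region each time: the winner is not determined by the rules alone, but by the particular history that unfolds.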




 

Communicative Planning

Communicative planning broadens the scope of voices engaged in planning processes. How does complexity help us understand the productive capacity of these diverse agents?

Governing Features ↑

A growing number of spatial planners are realizing that they need to harness many voices in order to navigate the complexity of the planning process. Communicative strategies aim to move from a top-down approach of planning, to one that engages many voices from the bottom up.




 

Assemblage Geography

Might the world we live in be made up of contingent, emergent 'assemblages'? If so, how might complexity theory help us understand such assemblages?

Governing Features ↑

Assemblage geographers consider space in ways similar to relational geographers. However, they focus more on the temporary and contingent ways in which forces and flows come together to form stable entities. Thus, they are less focused upon how relations are structured, and more upon the nature of the assemblages that come to exist as a result of shifting relations.

Assemblage geographers seize upon concepts of path-dependence and Bifurcations: moments when chance events determine the trajectory of systems that are sensitive to historical unfolding. Manuel de Landa explains that, in order to properly conceptualize actualized geographical space, it is necessary to see it as the manifestation of only one particular trajectory, situated within a much broader Phase Space of Virtual potentials. This introduction of history situates urban systems as subject to Contingency, with their actual behaviors representing only one possible trajectory within a much broader phase space of potentials.

Assemblage theorists frame the concept of Emergence in a much more philosophical manner. Following the work of the philosophers Deleuze and Guattari, they describe concrete urban entities as emergent, indeterminate, and historically contingent Stabilized Assemblages, brought into existence through distributed agency. The notion of an 'Assemblage' echoes that of an emergent characteristic, and some geographers have suggested the phrase 'Complex Adaptive Assemblage' in place of 'Complex Adaptive System'. Assemblages are configurations of inter-meshed forces - human/non-human, local/non-local, material, technical, social, etc. - that are stabilized at particular moments. Once in place, like emergent features, these take on agency in structuring further events. Agents in a particular assemblage have particular capacities, which one might see as analogous to Degrees of Freedom, but how these capacities manifest is subject to Contingency: predicated on the nature of the flows, forces, or Patterns of Interactions at play in a given situation.