Monday, December 29, 2014

Narratives of Causation, Structures of Reasoning

“You can’t connect the dots looking forward. You can only connect them looking backwards, so you have to trust that the dots will somehow connect in your future.” – Steve Jobs
In “Mind and Nature: A Necessary Unity”, Gregory Bateson noted that science probes; it does not prove. Yet we often assert that some latest piece of research has proved some theory or hypothesis. To definitively prove something, we would have to verify all possible instances of a claim – for example, to prove that all swans are white, one would have to show that all swans that have existed, currently exist and will ever exist are in fact white. This is impossible.

What science does do, however, is assemble evidence supporting the claim that provides a more ‘reasonable’ explanation than any other and/or makes better, more precise predictions.
But rather than a deep discussion of epistemology (how we know what we know), I want to explore how the way we frame our observations shapes the way we attribute causation, and therefore structures how we reason to form theories and explanations. Most of the time we observe a correlation – first X happens and then Y happens – and we want to explain why. We try to fit our observations of the sequences of events into patterns (stories) that make sense of what we think is unfolding.

Humans are driven to generate explanations for why things happened the way they did. That’s not to say that our explanations are unimportant – as Kurt Lewin says: “There is nothing more practical than a good theory”. Theories help us to determine what questions to ask. Theories are the way scientists create stories that link causes and effects into coherent wholes or systems. Understanding the whole system is vital because ‘causes’ derived as systemic properties are indirect and often very hard to track. The difficulty of understanding systemic causation is evident in how hard people find it to grasp climate change. What a good story-as-theory can do is enable our mind to grasp complex causation in a way that feels more like direct causation.

When our stories are well done they produce compelling, plausible and common-sense explanations of events. In this way we see that while data is fundamental, the data themselves remain mute without some form of understandable theory. Theory and hypothesis generation in turn depend on honed intuition and creative imagination. It is story-as-theory that provides the structure shaping how we reason about a problem and the framework that makes a question appropriate.

The traditional scientific methods and approaches that relied on reduction to attain simplicity tend to be unsuited to understanding complex problems and systems. A paradox, however, arises in the development of the various fields of research, study and application that constitute the sciences of complexity. These sciences have shown us that even a fully deterministic system can be essentially unpredictable, due to sensitivity to initial conditions as well as a plethora of other reasons. For example, depending on context, identical causes can produce different effects, and identical effects can be produced by different causes. Despite knowing about these conditions of change, we continue to feel confident in constructing causal explanations, all the while cautioning ourselves that correlation is not causation.
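A minimal sketch can make sensitivity to initial conditions concrete. The toy example below (my illustration, not drawn from the text) iterates the logistic map in its chaotic regime: a fully deterministic rule, yet two starting points differing by one part in a billion soon bear no resemblance to one another.

```python
# The logistic map x -> r*x*(1-x) with r = 4.0 is fully deterministic
# but chaotic: tiny differences in the starting state grow exponentially.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points differing by one part in a billion...
a = logistic_trajectory(0.200000000)
b = logistic_trajectory(0.200000001)

# ...diverge until they are effectively unrelated, which is why
# long-range prediction fails even with deterministic rules.
print(abs(a[-1] - b[-1]))
```

The practical point is the one made above: reproducible know-how about the rule gives no certainty about outcomes once small perturbations are in play.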

In our day-to-day lives we tend to explain a sequence that culminates in an outcome as cause and effect, after the outcome is already known, since we have observed a clear sequence of events. The explanation is often reduced to picking from a repertoire of common sense ‘causes’ that appear self-evidently true. We construct accounts that both make sense and seem like causal explanations. This is why hindsight can so often give us a clear sequence of events to which we can attribute causes and effects.

What this approach often misses is the range of possible alternatives that did not happen. We overlook the fact that there could have been innumerable ‘butterflies’ involved in what has happened and that these ‘butterflies’ could just as well have been different – and thus produced different outcomes. We mistakenly apply the seeming clarity of hindsight onto the future, in an effort to constrain the world of possibilities into a bounded ‘ink-blot’ upon which we can project probabilities.

It’s not as if there isn’t any regularity in the world – there is, and science will continue to enable us to observe reliable regularity and harness it. The point is that we must not confuse our capacity to determine robust regularity with causality – confusing effective know-how with the certainty of know-why. Most often, in order to observe and determine regularity we create arbitrary contained situations in order to manipulate a reduced set of observable variables until we have created reproducible correlations. But after we have developed these stable correlations – we still don’t know what will happen once we return our creation to the ‘wild’. This is especially true for the biological and social sciences.

George Lakoff (among a number of other cognitive scientists) has provided compelling evidence that reasoning itself is not solely structured as a universal mathematical logic. What humans do when we think is use three cognitive capacities: we think in chunks (events, space, entities, predicates); we use frames (roles, situational relations); and, inescapably, we rely on metaphors (as cross-domain mappings – frame-to-frame mappings). With these cognitive tools we create narratives that frame sequences (often linked with emotions and some form of moral stance). Narratives also allow us to enact simulations of our goals, satisfactions and frustrations. What Lakoff points out is that all words are defined in frames.

Lakoff notes that no language has in its grammar a natural way to express systemic causation; what exists everywhere is the notion of direct, if-then causality. This is also why so many people have difficulty understanding complex systems. Despite the best intentions of those practicing any form of science, the objectivity of science is inevitably vulnerable to the particular structures of reasoning inherent in the chunks, frames and metaphors used to describe observations and formulate causation. Even the purity of mathematics requires metaphorical translations in order to apply its logical formulations to reality.



While Lewin’s observation of the utility of a good theory is without a doubt true, the relationship between empirical observation and abstract theorizing remains an ongoing and intense arena of scientific and philosophical conversation. In general, what people are really good at is less about being ‘inherently rational’ and much more about an inherent ability ‘to rationalize’ – to apply forms of reasoning to make sense of things. In order to make the world sensible, we can integrate even the most unpredictable surprise into coherent, intelligible effects of previous causes.

While we are often capable of constructing an explanation of a causal sequence after an unexpected event has happened, it is often impossible to identify what will trigger such an event ahead of time. The explanations that we create after an event can make the cause-effect sequences seem inevitable – but such explanations don’t tend to account for any number of equally possible outcomes that didn’t materialize simply because of chance. What happens in history and life are not randomized clinical trials with adequate control groups. As Stuart Kauffman has noted, life creates the conditions of its own becoming – and that means the future will always be one of Volatility, Uncertainty, Complexity, and Ambiguity.

Another key issue in the perception of causality is captured in the classic Gestalt concept of the figure-ground relationship. Marshall McLuhan suggested that it is impossible to reveal the ground that makes visible the figure of our awareness without making the ground we want to reveal a figure in itself, thus creating an additional (and invisible) ground. What is required to make the ground visible, according to McLuhan, is to construct an ‘anti-ground’ or anti-environment. The concept of ground (as in the Gestalt figure-ground relationship) is a synonym for environment. One more point McLuhan makes is the suggestion that whenever a ground becomes the figure in social awareness, it is always perceived as a monster (I will be referring to this observation in following posts).

All complex and living systems require some form of constraints. As Kauffman says below, constraints are what enable the release and harnessing of the energy and information necessary to perform work.

The first surprise is that it takes constraints on the release of energy to perform work, but it takes work to create constraints. The second surprise is that constraints are information and information is constraint.
Stuart Kauffman – in Deacon (2012).
Terrence Deacon makes a very comprehensive case that constraints (as causal agents) restrict or confine systems or situations within boundaries. Often they are not perceivable in what is visible – they have to be conceived as ‘what is not there but could have been’, as alternatives/choices that cannot be undertaken. Constraints can arise inherently within a system or from its context. Dynamic systems are constrained within relevant degrees of freedom and tend to reflect attractor qualities.

The Internet of Things, the digital environment, the data-informational atmosphere are emerging as a new ‘ground’ of an emerging global civilization. I’m going to try to understand the implications of the digital economy by exploring some of the seismic shifts in the ‘ground’ of our economy, which are also transforming our society. In the next few posts I will engage in what I believe is a plausible line of reasoning that can anticipate, or at least let us imagine, how information and identity are shaped by and shape fundamental social constraints. What I want to do is extend this line of reasoning in order to understand how the emerging digital environment is a new ‘ground’ that will shape or require a corresponding concept of self and construct of identity.

Scientific Thinking in Business
http://www.technologyreview.com/news/523661/scientific-thinking-in-business/

Why everything that seems obvious isn’t
http://everythingisobvious.com/wp-content/uploads/2011/12/PWOct11watts1.pdf

Lewin, Kurt. 1952. Field theory in social science: Selected theoretical papers (p. 169). London: Tavistock.

Monday, December 22, 2014

Attractors of the Digital Environment – Constraints for Work

Transaction Costs and Organizational Structures - Why Does the Firm Arise?

In the last post I briefly introduced some aspects of change related to intensive properties and the concepts of formal cause and attractors. I now want to extend this line of thinking in order to explore the future of work.

Moore becomes Different 
In the first post of this series, “When Moore Becomes Different”, I set the frame that sometimes changes in quantity can produce dramatic qualitative change. In this post I’m going to make a case that the exponential rates of technological innovation shaping the emerging digital environment are enabling a ‘more is different’ phase transition from the organizational attractor of the last 5,000 years to a new one. The traditional attractor has favored centralized, hierarchical ways of organizing large-scale human effort – the attractor of the digital environment appears to enable the scaling of distributed, networked forms of organizing.

William Gibson coined the term ‘cyberspace’ and his work inspired the emergence of a science fiction genre called ‘cyberpunk’. He noted that the best thing one can do with science fiction today is use it to explore the present. In the 60s he felt that ‘the present’ lasted for about 3-4 years, since within that time little would change. But by 2010, his feeling was that big changes were a daily occurrence:

"Now the present is the length of a news cycle some days… The present is really of no width whatever."
For Gibson, the tricks that earlier generations of science fiction writers employed – extrapolating current technology to imagine future worlds – are becoming ever harder to use. He suggested that real-world events tend to overtake anything a writer can conceive before the book can find its way onto shelves and screens. As he notes:
“I have to figure out what it means to try to write about the future at a time when we are all living in the shadow of at least half a dozen wildly science fiction scenarios."
Gibson’s observations highlight just how fast things are changing and how difficult it is to extrapolate today into a vision of tomorrow.

Jeremy Rifkin notes that societies undergo fundamental transformation when technology revolutionizes how energy is harnessed, communications is enabled and transportation is coordinated. He outlines an emerging technology platform/infrastructure that includes: the Internet-of-Things (IoT) being built through the rapidly developing Communication Internet; a nascent Energy Internet; a Logistics Internet and new paradigms of manufacturing.

The IoT will connect everything and everyone. It includes uncountable sensors attached to resource, production, energy and logistics networks and flows – which in turn produce unimaginably large data trails for analysis. Included in this emerging fabric of the digital environment are the unprecedented possibilities of distributed crypto-applications like the Bitcoin blockchain. The IoT of sensors may make it impossible to ever again contain our personal information within the boundaries of previous concepts of privacy.

I will start first with an exploration of transaction costs as inevitable constraints that are necessary to do work.

Constraints and Work
All complex and living systems require some form of constraints. Kauffman provided me with a profound flash of insight in relation to this: constraints are what enable both the release and the harnessing of the energy and information necessary to perform work.
The first surprise is that it takes constraints on the release of energy to perform work, but it takes work to create constraints. The second surprise is that constraints are information and information is constraint.
Stuart Kauffman – quoted in Deacon (2012).
While there are a number of important constraints that I want to talk about, in this post I will focus on the basic nature of transaction costs as fundamental in shaping certain social-structural constraints. Ronald Coase won a Nobel Prize in economics for asking “Why does the firm arise?” – in a market economy, why are people gathered under one ‘umbrella’ to get things done?

Transaction costs involve the material, effort, attention, energy, time and information flows embedded in production systems (including search, negotiating terms, coordination, enforcement and communication). The distributed and decentralized ‘anarchy’ of market systems began to be more efficient only when the population increased in size and density, and communication/transportation systems dramatically reduced transaction costs and increased complexity by enabling ever more specializations and divisions of labor.

So despite the ‘efficiency’ of potential market systems, it remained both less expensive and easier to get things done by sharing purpose, dividing labor, and providing control through established roles, responsibilities and methods of communication. The hierarchic organizational structure was efficient despite the emergence of ‘efficient’ market systems.

For example, it is costly for everyone (in time, effort, money and resources) to search for employees or work opportunities, to constantly negotiate enforceable terms and compensation, and to coordinate collective effort toward common purposes (principal-agent problem). Longer-term contracts, divisions of labor, and other structures of the organization enabled significant savings in costs as well as higher productivity.

But efficiency is not the only constraint. According to Douglas Allen, there is another important dimension of the constraint inherent in transaction costs. In Allen’s reading of Coase, transaction costs are the costs (time, effort, money, opportunity) necessary to establish and maintain any system of rules and rights. These rules and rights in turn define what institutions are – systems of rules – which also means that individuals and organizations are merely the players/agents in those systems (although they all inevitably try to shape the rules in their favor).

The implication of this is that transaction costs also include the costs of mitigating bad behaviors. According to Allen, institutions emerge to maximize wealth and minimize the costs of establishing and maintaining themselves.

Coase also noted some ‘human’ reasons for a hierarchical organization in a market system, such as:
  • Some people prefer to work under direction (to follow) and are prepared to accept the corresponding conditions & restrictions; 
  • Some people prefer to direct others (to lead or manage) and are willing to accept the responsibilities and costs related to this role; and 
  • Some people prefer goods produced by firms. 
These more human constraints are feeling, and will continue to feel, the disruptive impact of technologies shaping significant socio-cultural expectations – I will explore this more deeply in later posts focusing on issues related to the shaping of identity.

The effective upper limit on the size that a traditional organization can reach and continue to be efficient is the threshold where the internalized transaction costs begin to exceed the transaction costs of a market situation. As the organization gets bigger there are diminishing returns to efforts to create more efficient management regimes. This is a very important threshold – as the digital environment continues to collapse the traditional parameters of transaction costs, this threshold gets steadily lower.
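Coase’s threshold logic can be sketched as a toy model (my illustration – the functional form and parameters are assumptions, not Coase’s): a firm keeps internalizing transactions while the marginal cost of internal coordination stays below the cost of using the market, so when market transaction costs collapse, the efficient firm size shrinks.

```python
# Toy model of the Coasean firm-size threshold (illustrative only).

def marginal_internal_cost(n, base=1.0, congestion=0.05):
    # Assumed form: coordination overhead grows with firm size n,
    # capturing diminishing returns to management.
    return base + congestion * n

def optimal_firm_size(market_transaction_cost, base=1.0, congestion=0.05):
    """Largest size at which internalizing one more transaction
    is still no more costly than going to the market."""
    n = 0
    while marginal_internal_cost(n, base, congestion) <= market_transaction_cost:
        n += 1
    return n

# As the digital environment collapses market transaction costs,
# the efficient-firm-size threshold falls:
print(optimal_firm_size(5.0))  # higher market costs -> larger firms pay off
print(optimal_firm_size(2.0))  # lower market costs  -> smaller firms pay off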

The evolving digital environment is enabling a profound collapse of traditional transaction costs – including the near-zero marginal cost economy that Rifkin describes – a phase transition representing a change in the conditions of change. We can see this already in the emergence of new platforms of productivity, new ways to get things done and new types and varieties of exchange.

The costs of new patterns and rates of interactive exchange, as well as those associated with search, negotiation, enforcement, coordination and communication, continue to collapse. This means new and different types of intensity thresholds (density, connectedness, etc.) are instigating a massive societal phase transition.

Throughout the history of human civilization (after humans had become agriculturalist and before the advent of the digital environment), there have been environmental constraints shaping transaction costs. These transaction costs determined the attractor of efficient organizational structures, which have until the digital environment been hierarchic.

We know that for most of human history we lived in small groups of hunter-gatherers that were relatively egalitarian (not as we would define equality, but focused on constraining hierarchic social structures). The tendency toward hierarchic structure has been much less a result of human traits and much more the result of the attractor of organizational efficiency.
We are in the midst of crossing a fundamental threshold: an exponential decrease in transaction costs – in the patterns and rates of interactive exchange, as well as in search, negotiation, enforcement, coordination and communication – is enabling new types and varieties of exchange.
The digital environment entails the emergence of new constraints. As these new constraints become more established and widespread (e.g. approaching near zero marginal costs) they will determine a different attractor of organizational efficiency. I think of this new attractor as social computing. Jeremy Rifkin in his book “Zero Marginal Cost Society” refers to the emerging ‘collaborative commons’.

Ever since Adam Smith’s elaboration of the division of labor with his example of the ‘pin factory’, we have known that economic prosperity and productivity gains are founded on the division of work into ever smaller units. Correspondingly, the increasing population density of urban life has enabled the sustainment of ever more specialized work and workers. In essence, this theory proposes that the digital environment establishes conditions that not only enable but will inevitably induce hyper-connectivity, which in turn enables/induces a hyper-differentiation of labor and specialization.

The paradox of increasing specialization and hyper-differentiation is a corresponding dependence on exchange – and in the digital environment this means hyper-exchange. The result is an acceleration of knowledge flow – a hyper-knowledge metabolism. Thus, the phase transition inaugurated by the digital environment will require new design principles for harnessing human capital, new institutions, new social structures and a new political-economic philosophy.

The issue of new institutions is profound. Douglas Allen argued that a fundamental institutional revolution emerged, driven and enabled by new technology that allowed people to mitigate the randomness of nature. For example, the steam engine allowed energy to be produced more reliably and ships to navigate with much less dependence on natural forces (e.g. wind and tide). Inexpensive timepieces enabled coordination of activities across all social domains and time zones. New capacities to measure behavior created new means to manage systems.

Jeremy Rifkin outlines many key transformations emerging today. For example, solar and wind power are following a Moore’s-Law-type trajectory, suggesting that it is not if, but rather when, we get cheap abundant energy. The Internet of Things (IoT) is enabling the emergence of whole new paradigms of logistics, communication, organization and coordination. The emergence of 3D printing will (sooner or later) change the way we make and distribute goods. Advances in AI and robotics (how smart and/or capable will Baxter the robot be in 2025 – especially when Baxter is connected to Watson?) suggest that whatever can be automated will be, and whatever involves information will be disrupted.

The emergence of augmented reality will transform pretty much all professions. How soon will it be before you won’t trust your doctor because she isn’t wearing a device that connects her to Watson (who has read all the medical literature up to the minute and can diagnose and compare state-of-the-art treatments) to augment her knowledge with AI analysis?

These advances in technology can’t just be overlaid onto existing social and organizational structures and be expected to provide optimal benefits, including increased productivity. For example, I think of the tremendous value that Wikipedia has created and continues to create, yet it remains outside any current measure of any nation’s GDP. How could this value be distributed properly to its creators (both those who add and curate the content and those who use it to add value elsewhere)?
The ‘message’ (the change in scale, scope, pattern and pace of behavior enabled by a medium) of the digital environment as social computing means that social computing is displacing the architecture of the industrial machine and its hierarchical management structures. Examples of social computing increase daily but include Wikipedia, Uber, Open-Source, FoldIT/eteRNA, etc.

Essentially, social computing is the capacity to assemble knowledge networks (my assumption is that knowledge lives only in people and their living networks) as and when needed. The future is less about assigning work to the ‘person in the job’ and much more about connecting the ‘person best able’ to add value to the work-at-hand. But how do we enable the new organizational and participatory architectures with the requisite social and governance institutions? It is not simply about liberating work in an unchanged society.

What does a society and the world economy at large need to harness full human wealth – to bridge the ingenuity gap? We fundamentally need to re-think how to distribute the gains of productivity. Especially since more and more of traditional productivity will be delivered via AI and automation. We need a 21st Century social platform.

Key Points:

· The costs of coordination, search, transaction, and (self) organizing are collapsing:
  • The pool of knowledge outside the organization is larger than the pool inside, crowdsourcing = employing countless workers 
  • Agility requires architectures of participation and new ways to design how things can get done 
  • Whole systems are characterized by flows between and among component parts – knowledge metabolism 
In the next post, I will begin to explore the nature of social and system constraints on identity – the development of institutional innovations will demand that we all rethink the constraints necessary for the creation of value in the digital environment.

References:
Coase, Ronald. 1990. “The Firm, the Market, and the Law”. University Of Chicago Press.

Allen, Douglas. 2011. “The Institutional Revolution: Measurement and the Economic Emergence of the Modern World”. University Of Chicago Press.

Rifkin, Jeremy. 2014. “Zero Marginal Cost Society: The Internet of Things, the Collaborative Commons, and the Eclipse of Capitalism” Palgrave Macmillan Trade.

Tuesday, December 16, 2014

Causality or Why Things Change

Thus far I've outlined some fundamental ways that 'things can change' and some conditions that favor certain ways of changing. In this post I want to briefly discuss causality, in order to complete the basic groundwork for an ongoing exploration of the future.

Aristotle listed four types of causality – Efficient, Material, Final and Formal. I’m going to assume that the first three are familiar enough that I won’t provide much explanation.

Efficient and Material causation are the basis of modern logical positivism and provide the most fundamental assumptions for scientific explanations, including in the social sciences. These two ways of understanding why change propagates have provided the primary foundations for the scientific understanding of the world and produced spectacular success in driving scientific and technological progress. However, there remains much in reality that is beyond the explanatory framework they provide.

Simply put, efficient causation represents the effects that we see from the impact of applied energy/force. A simple example is the resulting behavior of billiard balls on a billiard table when someone strikes the cue ball against the racked set of billiard balls or the way chains of dominoes can fall in cascades when they are arranged in appropriate patterns and proximity.

Material causation, on the other hand, represents causes arising from or determined by the nature of the material involved. For example, the properties and mass of granite are significantly different from those of wood, entailing correspondingly different causal consequences when they interact with other types of matter. Materials sciences have made striking progress by understanding the basic properties of materials, by themselves and in combination.

Final causation relates to intentions and represents causal aspects arising from the aim or purpose being served when changes are set in motion. For example, the final cause of building a sailboat might be sailing, or it might simply be wanting to own a sailboat. Although Aristotle might have said that the final cause of a ball at the top of a ramp is coming to rest at the bottom, we would now understand that as efficient cause.

What is very important to consider when engaged in foresight efforts is that people will change their plans and intentions in response to what they perceive. This is so often forgotten when we model trajectories of change, and it is why it is so important to create various scenarios that can incorporate the consequences of decisions and actions taken in response to current and potential future conditions. In fact, we must be very careful in applying assumptions of motivation to our visions of the future. The classic fatal flaw is the concept of the rational actor as applied in neo-classical economic models of forecasting and policy development.

The fourth cause – Formal Cause – is more difficult to explain. However, formal cause may have a great deal to contribute to the ongoing development of complexity sciences and the understanding of 'emergence' as a type of unpredictable change. Formal cause also can contribute to an understanding of complex change that can take place on longer time scales.

Some have defined formal cause as the ‘essence’ of what it is to be something. A simple illustration of all four causes will help. Take a person making a statue. Efficient cause explains the contact of the chisel with the marble. Material cause is the marble itself. Final cause is the end (the vision of the sculpture) that the sculptor has in mind. Formal cause would be the essence, idea or quality of what a statue is – its ‘statueness’.

However, this particular meaning of formal cause seems insufficient today. Marshall McLuhan noted that: “Formal cause is still… hugely mysterious. For the literate mind formal cause remains too paradoxical and irrational. It deals with environmental processes working outside of time”… that “effects precede causes”. For McLuhan, formal cause is like the “bright light of the future casting shadows on the present, from forthcoming events.” It remains invisible in the contextual, interactive, ‘ground’ of an event, yet enables unexplainable effects. 

Formal cause is made more difficult to perceive because all four types of causation work simultaneously, and we have been educated and conditioned to think primarily in terms of efficient and material causation – of causes arising in the past and entailing a ground-up chain of effects.

Stuart Kauffman has recently been exploring what he calls ‘formal cause laws’ as a means to develop better theoretical explanations of complex adaptive systems and emergence. Kauffman argues that when we are thinking about biological and evolutionary systems it is not possible to pre-state the niche boundary conditions.

As Kauffman explains it, changes in the niche landscape do not ‘cause’ evolution but rather ‘enable’ it. Kauffman provides a simple example. It is easy to imagine the traditional causal network that would result in the development of the swim bladder of a fish. The evolution of the swim bladder provides the fish with more fitness to compete and reproduce in its particular environment. However, when a microbe settles into the swim bladder, turning it into a new niche environment, there is no direct causal mechanism that set this in motion. The swim bladder ‘enabled’ an affordance or opportunity (or, as Kauffman calls it, an adjacent possible) for the microbe to seize.

One could not predict that a microbe would create such a new niche - there was no sort of causal trajectory that would allow us to anticipate such an affordance. The bladder-as-niche was not within the pre-determined space where a trajectory between microbe and bladder could be calculated like billiard balls on a pool table.

With this perspective we can easily see that the emergence of a self-reproducing organism/organization cannot be reduced to traditional efficient or material cause entailing laws. Kauffman proposes that emergence of this sort is based on ‘Formal Cause Laws.’ In this way Kauffman suggests that Formal Cause Laws for a theory of complex adaptive systems would arise from the ‘whole situation’ – whether it is an ensemble of random networks of chemical reactions or a growing economic ecology.

Another way to imagine the working of Formal Causation is with the example of an attractor. While there are many types of attractors, they can be simply defined as a set of states toward which a system tends to evolve, regardless of the starting conditions of the system.

Attractors display recognizable trajectories, which can be periodic or chaotic, and the only constraint they have to satisfy is remaining on their particular pattern of trajectory. Manuel De Landa uses the example of the behavior of a soap bubble as shaped by an attractor. No matter what the initial shape of the bubble wand is, the bubble will tend toward a perfect sphere (air currents permitting).

The explanation for this tendency is that the sphere minimizes the surface tension of the soap film. However, the ‘cause’ is not surface tension so much as something that emerges from the entire dynamic system of soap film, water, atmospheric pressure and other environmental influences. The attractor can be likened to the Formal Cause inherent in the whole situation, which shapes soap bubbles along a trajectory of ‘wanting’ to become perfect spheres.
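The pull of an attractor can be made concrete with a toy example (a minimal sketch of the general idea, not Kauffman’s or De Landa’s own model): repeatedly applying x → cos(x) draws every starting value toward the same fixed point, much as bubbles of any initial shape settle toward a sphere.

```python
import math

# A fixed-point attractor: no matter where the trajectory starts,
# repeated iteration of x -> cos(x) settles onto the same value
# (~0.739085), regardless of initial conditions.

def settle(x0, steps=200):
    """Iterate x -> cos(x) from x0 and return the settled value."""
    x = x0
    for _ in range(steps):
        x = math.cos(x)
    return round(x, 6)

# Very different starting conditions, one attractor:
print(settle(-3.0), settle(0.1), settle(10.0))  # all print 0.739085
```

The ‘cause’ of the convergence is not any one step of the iteration but a property of the whole dynamic – exactly the sense in which an attractor resembles a formal cause.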

Kauffman believes that Formal Cause Laws are a new class of laws arising from the generic behaviors of large assemblages of systems, and that they can provide a better foundation for understanding the sciences of complexity, emergence, evolution and even economics. I can't help but think that Hegel's famous remark that "the truth is the whole" (Preface, Phenomenology of Mind 81-82) is very suggestive here, as is Whitehead's:
Every entity is to be understood in terms of the way it is interwoven with the rest of the universe.
Alfred North Whitehead 
In my next post I want to extend this discussion of formal cause and attractors toward a more relevant understanding of the future of work. I would like to suggest a different way to think about the current trajectories of disruption – the shift from centralized, hierarchical ways of organizing human efforts toward more distributed, networked forms of organization. This will help set up some ideas about the future of identity.

References:

McLuhan, Marshall; McLuhan, Eric. 2011. “Media and Formal Cause”. Neopoiesis Press.

Kauffman, Stuart. 2013. “Beyond Reductionism Twice: No Laws Entail Biosphere Evolution, Formal Cause Laws Beyond Efficient Cause Laws.”

De Landa, Manuel. 2011. “Intensive and Topological Thinking.” European Graduate School.


Friday, December 12, 2014

Intense Conditions of Change

In the previous post I elaborated some basic ways that things change. In this post I’m going to discuss another fundamental aspect of change – an aspect of physical conditions. In broad terms, the measurable world is divided into extensive and intensive properties. The extensive properties include the basic ways we would measure a physical object – length, width, breadth, volume, mass, etc. Change in extensive properties often represents a change in the amount or shape of whatever is being measured.

Intensive properties are measurable domains such as temperature, pressure, density, connectivity, conductivity, viscosity and malleability. These sorts of properties are not additive in the way extensive properties are. For example, to make something larger one either has to add more of the same material or stretch the material (likely making it thinner). But making something hotter requires an increase in energy, not any addition of material.
What is really interesting is that phenomena measurable by intensive properties are subject to a particular type of change called a Phase Transition.


A phase transition is a very dramatic type of change within a very narrow band of measurement – such as when water turns to ice, which happens at a singular threshold of zero degrees Celsius. Two completely different ‘substances’ or ‘conditions’ are evident on each side of 0 degrees – on one side a fluid, on the other a solid.

The curve of a phase transition can look almost identical to the curve of an exponential change, but a phase transition is different in that it is a fundamental change in the nature of a medium that seems to happen magically at a single point of transition. This type of change is very difficult to anticipate unless we have already experienced it.
Trend analysis does not prepare the observer for this type of change.
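A worked example shows why trend extrapolation fails at a transition (a sketch using approximate textbook constants, not figures from this post): track the temperature of a kilogram of ice as energy is added. The smooth early trend breaks at 0 °C, where added energy is absorbed by the phase transition itself rather than raising the temperature.

```python
# Heating 1 kg of ice from -20 C: temperature as a function of
# energy added. Constants are approximate textbook values.
C_ICE = 2100       # J/(kg*K), specific heat of ice
C_WATER = 4186     # J/(kg*K), specific heat of liquid water
L_FUSION = 334000  # J/kg, latent heat of fusion

def temperature(energy_j, start_temp=-20.0):
    """Temperature of 1 kg of ice after adding energy_j joules."""
    to_melting = (0.0 - start_temp) * C_ICE  # energy needed to reach 0 C
    if energy_j < to_melting:
        return start_temp + energy_j / C_ICE
    energy_j -= to_melting
    if energy_j < L_FUSION:
        return 0.0  # the plateau: energy drives the transition itself
    return (energy_j - L_FUSION) / C_WATER

# Equal energy steps, wildly unequal temperature steps:
for e in (0, 40_000, 100_000, 300_000, 400_000):
    print(e, round(temperature(e), 2))
```

An observer extrapolating from the steady early warming would predict continued warming – and would be wrong for the entire span of the plateau.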

Another example occurs in some types of dynamic systems when a phase transition threshold results in proliferating bifurcation. What happens in this case is that an apparently smoothly progressing change reaches a certain value and the system experiences a sudden qualitative or topological change in its behavior. The graphics below illustrate this better than words.

This first graphic is the classic bifurcation graph.

This second is a bifurcation fractal.
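The period-doubling behind these graphs can be reproduced in a few lines (a standard logistic-map sketch; the parameter values are illustrative, not taken from this post): as the control parameter r rises smoothly, the long-run behavior changes qualitatively at sharp thresholds.

```python
def logistic_attractor(r, x0=0.5, transient=500, keep=64):
    """Iterate the logistic map x -> r*x*(1-x) past its transient,
    then collect the distinct values the trajectory settles onto."""
    x = x0
    for _ in range(transient):
        x = r * x * (1 - x)
    settled = set()
    for _ in range(keep):
        x = r * x * (1 - x)
        settled.add(round(x, 6))
    return sorted(settled)

# Smoothly increasing r, suddenly different long-run behavior:
print(len(logistic_attractor(2.9)))  # 1: a single fixed point
print(len(logistic_attractor(3.2)))  # 2: a period-2 cycle
print(len(logistic_attractor(3.5)))  # 4: a period-4 cycle
```

Nothing in the rule changes between r = 2.9 and r = 3.2 except a small, smooth increase – yet the system’s behavior bifurcates.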
Why are intensive properties and phase transitions important to understanding social and technological change? 

Among the many properties describing human societies, population density and connectivity represent fundamental intensive properties with significant social and structural implications. What does a phase transition look like in a social context? As human populations experience increases in density, we see critical phase transitions that enable divisions of labor to proliferate – in ways that create new types of societies.


For example, when humans were hunter-gatherers, local groups generally never exceeded a population of 150-250 – a density that can only sustain very rudimentary divisions of labor (e.g. elder, adult, child, male-female, hunter-gatherer, shaman-healer, etc.). Social groups of this size and density are also vulnerable to the easy loss of specialist knowledge. If one or two people become more specialized – because of innate talent as well as having the time/space to develop a particular skill or body of knowledge – then the group can become dependent on them. If one or both are lost through disease or death, that knowledge is lost permanently or has to be developed anew.

As humans formed agricultural societies, local groups were able to increase population densities (by exponential amounts in some cases). This enabled a phase transition in which many more permanent divisions of labor, and whole new occupations, arose – each occupation also becoming a domain of specialized knowledge – which enabled unprecedented new ways for a person to ‘be’ in society (e.g. a shoe-maker, tailor, baker, herbalist, etc.).

New institutions became necessary as well. The agricultural society was more than a large gathering of hunter-gatherers who could farm. It required a new institutional framework with many new institutions and conventions. Along with increases in population density and new divisions of specialized labor came a necessary increase in the levels and types of exchange – in turn creating new forms of interdependence.

The rise of civilization was a phase transition enabling, and enabled by, the increased population intensities of large city-states. A similar phase transition occurred in the course of the emergence of industrial society – an exponential rise in population density, more levels of specialization, more exchange, and whole new institutions (e.g. enablers for the governance of market and democratic political economies, public education, impartial justice, etc.).

Other types of ‘intensity’ create conditions for changing a social context, even one with a relatively stable population density. For example, the emergence of new communication technologies enables increased densities (through the sense of collapsing distance) and/or increased connectedness. The industrial era collapsed distance with the development of the steam engine, rail-based transportation and the emergence of the telegraph.

Understanding the impact of increased population, connection and communication ‘densities’ can help us to imagine the potential of the emerging digital environment. Social media has been described as an exponential increase in ‘density’ of communication and connectedness. However, the rapid evolution of the Internet and the Internet of Things (IoT) is more profound than the impact of social media.

At this point it seems I have fallen into a line of logic that impels me beyond the elaboration of intensive properties alone and into an elaboration of the future plausibilities of the intensiveness of the digital environment. So I will continue – but I will next discuss in more depth concepts of Causality.

The digital environment is fundamentally disrupting the industrial economy, its institutions and its organizations, by enabling conditions that inevitably favor hyper-connectivity, which in turn leads to a hyper-division-of-labor (or hyper-specialization). This in turn entails a requisite hyper-exchange. Together, hyper-specialization and hyper-exchange produce a hyper-knowledge-metabolism.

I want to make the comprehension of the intensities of population, communication and connectivity into a McLuhanism:
If Social Media is the Medium,
Then Social Computing is the Message.
This entails that Organizations cannot be Architected as ‘manufacturing machines’.
They must now be Architected to be Programmable, Complex Adaptive Systems.

For McLuhan, a Medium was anything that extended the mind, body or senses. By this definition a Medium could be a new technology, process, idea or original creative work. However, for McLuhan the message of a Medium was not its contents. The message only becomes clear in the resulting differences in human interactions and activities. This means the message is perceived in the differences that arise in changes of scale, pace, scope or pattern that a medium causes in us as individuals or as a society or culture.

These changes (the Message) are distinct from the content of the Medium. Thus it is not the information conveyed through the Medium but rather the Medium’s ability to change the way we act or perceive that is the key factor enabling us to realize and understand a Medium.

Social computing, then, is the capacity for a large network or ‘swarm’ of people to explore a problem space in parallel and produce a range of effective solutions, and/or produce a good or service. Examples include Wikipedia, the many open-source initiatives, and the increasing use of crowdsourcing and crowdfunding approaches as new modes of production.

As swarming suggests, social computing is self-organized collaboration that depends on participants performing without centralized direction or coordination. To perform without centralized coordination, participants have to be increasingly self-directed and socially aware – i.e. responsibly autonomous. Social computing as self-organized collaboration therefore requires a different organizational operating system – such as the ‘networked individualism’ proposed by Barry Wellman.

A programmable organization depends on social computing for the rapid and agile generation, assembly and harnessing of knowledge networks, as and when needed. In using this phrase we assume that knowledge (as opposed to information) only arises in embodied minds. In this way a programmable organization enables the assembly of people to harness increasingly specialized skills, talent, knowledge and motivations.

What all of this implies is that the traditional ways of organizing efforts – re-configuring and retooling hierarchical organizational processes, or re-architecting traditional management/leadership structures, occupational frameworks, job descriptions, etc. – will no longer be adequate for survival.

A programmable organization has to rely on the responsible autonomy of its participants and their capacity to embrace networked individualism as a social operating system. For example, a computer is a general-purpose programmable machine. As such it doesn’t require a hardware reconfiguration in order to run innumerable applications; rather, such a system only requires a different set of instructions. This is unlike many current industrial enterprises, which must reconfigure their organization charts, reporting relationships and physical accommodations in order to enact organizational change.

A digital and human medium of hyper-connectivity propagates a phase transition to hyper-division-of-labor, hyper-exchange and hyper-knowledge-metabolism – a form of “hyper-conversation.” This implies a faster rate of innovation, as the more we can exchange the more likely we are to experience serendipitous insights and inventions.
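One way to see why connectivity compounds into hyper-exchange (a back-of-the-envelope sketch, not a claim made in this post): the number of possible pairwise exchanges grows quadratically with the number of connected participants.

```python
def possible_exchanges(n):
    """Distinct pairs among n fully connected participants (n choose 2)."""
    return n * (n - 1) // 2

# A 100x increase in connected participants yields a ~10,000x
# increase in potential exchanges -- fertile ground for serendipity.
for n in (10, 100, 1000):
    print(n, possible_exchanges(n))  # 45, then 4950, then 499500
```

Each new participant adds not one opportunity for exchange but a connection to everyone already present – which is why increases in connective ‘density’ can tip a system across a threshold.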

In the emerging world where everything that can be automated will be, the emphasis on insight, invention and innovation forces us to think about which activities humans tend to be best at. If we can meet this challenge we will be in a position to create new forms of wealth – the wealth of people.

This line of reasoning has taken me well beyond an effort to elaborate the concepts of intensive properties and their social consequences in the 21st century. However, before I can pursue the implications of this line of reasoning for the social construction of identity, I will have to explore some concepts of causality – the next blog.

Reference:
Wellman, Barry; Rainie, Lee. 2012. Networked: The New Social Operating System. MIT Press.