Monday, March 19, 2012

the matter and anti-matter of information


I've just finished reading a wonderful article by David Hunter Tow, Director of The Future of Life Research Centre.

I wonder if Tow has read Kauffman (I know I seem to be obsessed with Kauffman these days, but...). The views expressed in Tow's article are consistent with Kauffman's and are maybe a logical extension of them. Below I use italics to indicate where I have quoted from Tow's article.

Before I start paraphrasing Tow, I'd like to elaborate some of my thoughts on the nature of information. Gregory Bateson defined information as 'difference that makes a difference' or 'news of difference' (that makes a difference). Bateson's concept of information led him to formulate an epistemology of 'pattern'. He contrasted this epistemology with the epistemology underlying the standard scientific-physics worldview, which he called an epistemology of energy (e.g. that which can describe the events of billiard balls on a pool table). As Kauffman has confirmed, this epistemology is inadequate for describing complex, living and evolutionary systems.

What is fascinating about Bateson's view of information and its corresponding epistemology of pattern is that pattern includes the world of 'meaning' - that is, meaning as a non-energetic relation of cause and effect. That doesn't mean energy is not involved, but it does mean that energy is not the cause of the reaction. For example, a man says he loves a woman. The energy involved in the exchange is simply the auditory energy that causes sound vibrations to trigger sensory mechanisms in the woman's ear. However, the reaction to those words - the meaning - arises from the woman's own energy: she may swoon with rapture or recoil in horror. The 'meaning' arises as difference that really has no energetic base (even though the woman's brain must generate energy no matter what the response, it is not the energy itself that determines the response).

It is not the energy transmitted in a printed word that stimulates a reaction; it is the meaning of those words that stimulates action. Yes, energy is involved, but it does not determine the particular action. In fact it is possible for the exact same amount of energy to be involved in completely different actions, stimulated by differences of meaning which in and of itself has no energetic base. Although meaning is transmitted through a medium (which involves energy), it is not the energy itself which 'contains' the meaning.
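A toy illustration of this point (a sketch of my own, not Bateson's formalism): two messages can be assembled from exactly the same characters, carrying exactly the same aggregate 'quantity' of signal, and yet mean entirely different things. The difference that makes the difference is the pattern, not the amount.

```python
# Two messages built from the very same characters: identical length,
# identical total byte "quantity", yet entirely different meanings.
a = "dog bites man"
b = "man bites dog"

assert sorted(a) == sorted(b)              # same raw material (an anagram)
assert sum(a.encode()) == sum(b.encode())  # same aggregate byte "energy"
assert a != b                              # different pattern, different meaning
```

The same caloric 'stuff' delivers entirely different news, which is the point: no measure of quantity distinguishes the two messages.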

Another example Bateson used is useful (I was going to say fun - but I don't want animal lovers to get me wrong). If you kick a dog - the subsequent behavior of the dog obeys the 'laws of physics' only for a short time. The next reactions of the dog will depend on the 'meaning' the dog creates for the kick. In essence, the dog will process the meaning at least through the four 'F's' - should the dog Fight, Feed, Flee or F*ck the kicker? This reaction can't be approached via an epistemology of energy.

Let's try another example: stem cells are 'pluripotent' (essentially they can take on many forms, so that the same 'cause' has potentially many different 'effects'). The choice of what form a cell takes has to do with the context it is in (chemical/molecular gradients, etc.). Here differences are vitally important, but these differences can't be assessed by caloric variables; rather, there are complicated multiple gradients (patterns) that are differences that make a difference. One could probably construct different situations whereby the average quantities of molecules and energy are equivalent but are ordered differently through the gradients. What produces different outcomes for the stem cells, therefore, is the pattern of gradients rather than the overall energy and molecular quantities.

This is much like what can happen when water turns to ice: it is not the actual amount of energy change that 'causes' a phase transition but the contextual condition that 'enables' it. A change of one degree anywhere between 99 and 1 degrees Celsius is indistinguishable from any other one-degree change. But the identical energy difference between 1 and -1 degrees Celsius enables a change in the conditions of change. This change arises not simply from a unit of energy change but in the medium itself (the change could be argued to be 'self-energizing').
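A minimal sketch of that asymmetry (a toy model only - it ignores latent heat and simply treats 0 degrees Celsius and below as ice):

```python
# Toy model: the same one-degree step has radically different consequences
# depending on where in the medium's state space it lands.
def phase(t_celsius):
    return "ice" if t_celsius <= 0 else "water"

# An identical difference of one degree, in two different contexts:
assert phase(99) == phase(98) == "water"          # 99 -> 98: indistinguishable
assert phase(1) == "water" and phase(0) == "ice"  # 1 -> 0: a phase transition
```

The 'cause' (one degree of energy change) is identical in both cases; what differs is the context that enables the transition.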

Bateson went on to discuss the inherent 'non-materiality' of information (as constituted by differences and patterns of differences). Fundamentally, an actual 'difference' is itself non-material, although it does require media to propagate through various transformations (Shannon's theory, for example, being all about ensuring that the various transformations do not gain or lose the initial 'differences', or pattern of differences, as information).

The key point for Bateson is that while energy (and matter) definitely carries information - information itself is not the energy. Pattern can undergo all sorts of 'transformations' through various media and retain a fidelity.
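To make this concrete, here is a small sketch (my own illustration, not Bateson's or Shannon's): the same pattern of bytes can be pushed through several materially different 'media' - hex text, base64 text, even a single large integer - and recovered intact each time.

```python
import base64
import binascii

message = b"news of difference"

# Transform the same pattern through three materially different "media":
hex_form = binascii.hexlify(message)        # a string of hex digits
b64_form = base64.b64encode(message)        # a base64 string
int_form = int.from_bytes(message, "big")   # one large integer

# Each transformation is invertible: no 'difference' is gained or lost,
# even though the carrying medium is different every time.
assert binascii.unhexlify(hex_form) == message
assert base64.b64decode(b64_form) == message
assert int_form.to_bytes(len(message), "big") == message
```

The energy and matter involved in storing each form differ; the pattern, and hence the information, is identical throughout.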

In responding to a meaning-pattern, it is not the caloric content of the message that is important; it is the pattern upon which the living entity organizes a meaning. The expression of a message of love can be 'perceived' through many forms (touch, sound, sight) regardless of the caloric content.

While meaning is not 'free-floating' or disembodied, neither is it measurable by, or reducible to, energy or matter. Pattern itself is embodied but is non-material in that it is transformable through all sorts of matter-media. To send a signal is to enable a pattern to be transformed (with fidelity) through various media (matter/energy). What is sent/transformed is not the actual matter-energy but a non-material pattern. Pattern in turn has 'meaning' only in relation to context and the entities involved.

The notion of a self-energized 'pattern' only relates to living systems and their action on the world/context they organize. Thus a soft kick to a dog can produce a bite or indifference; the intensity of the dog's response may have little to do with the 'energy' transferred by my kick. The response is 'self-energized', dependent on the meaning the dog 'attributes' to my kick.

Thus, while differences-as-pattern need to be-become embodied, they themselves are not the material medium that embodies them. If information (as patterns of difference) were material, these patterns could not be transformed across various other media. In a sense this could be used as an argument for another type of 'mind-body' dualism. But I think a better argument is the concept that Kauffman calls 'enablement': information and matter are mutually enabling, merging the virtual and the concrete.

One of the interesting questions that arises from this view is 'Where do new patterns come from?'

Thus, what is very interesting about the idea of life arising from capacities to re-cognize and transform information is that the information itself is completely non-material.

Tow notes:
However adaptive living systems may be primarily differentiated by their capacity to utilise and process information by storing, monitoring and transforming it. Information is coded, stored and processed in the neural network structures of the brain and nervous system, the DNA, RNA and protein structures of the cell including its microtubule scaffolding, as well as the myriad other chemical, sensory, signalling and metabolic feedback loops that allow life to function within a complex environment.

By transforming information, life evolves towards greater complexity. The more complex life becomes, the better it’s able to learn, adapt and continue its trajectory in the universe.

There is an inter-being implication of a co-determining immaterial message, medium and transformational events.   We as humans may be on the threshold of understanding the vast scales and scope of the nature of the message, the medium and the events. 

A new twist has recently added an extra and crucial dimension to this story- the discovery of a form of life based on inorganic chemistry. Up until now, all life on earth has been assumed to be based on organic biology- carbon in the form of amino acids, nucleotides and sugars etc. But Professor Lee Cronin at Glasgow University has engineered a form of self-replicating, evolving, non-carbon based cell with life-like properties. It has opened the possibility of creating micro organisms from inorganic chemicals- proving that evolution is not just a biological process. This suggests that there may be many alternate sets of non-carbon life forms and that the evolutionary principle may have much more general application than previously understood. 

Applying a more abstract concept-theory-philosophy of life, we can hypothesize that life is a universal emergent phenomenon that occurs in systems that are far from equilibrium, as Ilya Prigogine originally proposed.
While evolution has been understood as a fundamental physical process giving rise to biological phenomena, this understanding has limited our view of life and living systems within an original biological straitjacket. Life as it has evolved on Earth may be just one local solution or instantiation within a vast landscape of possible options, biological or otherwise - we have studied only a tiny fraction of a much bigger picture.
In this way we can understand that the trajectories of human science, and the corresponding spaces of adjacent, enabling possibles, open up new instantiations of that immaterial message that is life.
Now science's Pandora's box has been opened to release three players in the great game of life: biological, synthetic and virtual. All three will have to learn to co-exist and accommodate each other. As the biological, technological and social barriers dissolve, they will eventually merge into a new entity - Meta-life - a universal autonomous manifestation of intelligence.
These developments clearly demonstrate that life is now at the threshold of its most significant transformation. In the near future, the links between artificial life (AL), synthetic life (SL) and biological life (BL) will be virtually seamless. Powerful AI techniques in the form of neural nets, swarm systems, fuzzy logic and evolutionary algorithms will merge with the massive pattern analysis capacities of the Web's computational intelligence, complementing human decision making at a fundamental level and creating a permanent nexus.
Another important implication of understanding that life is not the matter it is constituted by (though embodied in it) is that the Universe has not adapted to the highly improbable set of conditions essential to triggering our form of life. Instead, life has adapted to the physical conditions set by the universe, which itself has adapted to the physical and chemical properties of the much larger multiverse. The larger multiverse is more than the past's bifurcations at moments of choice; it is also an ever-widening possibility space.
Biological processes might in fact be the means of selecting those laws of physics that best boost life’s own survival. The prerequisite conditions for such opportunistic systems will narrow down the range of possible structures to a small proportion of available configurations, including but not limited to those in which the laws of physics support life as we know it. 

I think this view is an important 'qualifier' when we undertake to understand the notions of sustainability, of life, of evolution, of ecologies - if only to think about these things in ways that are less vulnerable to human-centric nostalgic romance. As McLuhan remarked (in the 60s), when an environment is new we perceive the old environment for the first time - more particularly:
It is peculiar to environments that they are complex processes which transform their content into archetypal forms. As the planet becomes the content of a new information environment, it also tends to become a work of art. Where railway and machine created a new environment for agrarians, the old agrarian world became an art form. Nature became a work of art. The Romantic movement was born. When the electric circuit went around the mechanical environment, the machine itself became a work of art. Abstract art was born.

Are we entering an age of abstracted life?

Friday, March 9, 2012

Science and Mythos - Toward a New Mythos


 I would like to thank my colleague Paul for the conversation that enacted this moment of social thinking.

I shared with Paul, the new papers by Stuart Kauffman
Answering Descartes: Beyond Turing and No entailing laws, but enablement in the evolution of the biosphere

Paul replied very quickly on receiving the papers, but before he had read them. He, like me, is deeply interested in the topics of evolution and complexity. He is a real intellectual and curious about the world. He spoke about his question of whether the laws of physics are the only laws of motion, because if that were the case, then of course the future should be fairly determined and thus predictable.

But even if such were the case, Paul argued that we subjectively experience it differently - an experience that many scientists reject as 'noise', despite the fact that 'being subjectively objective' is also a subjective experience. :)

Paul recalled reading about a Cambridge University debate between a theologian and an atheistic natural scientist about the existence of God. He believed the atheist/scientist won, since in the end the theologian could only resort to 'I believe'. However, he felt that this may have been a pyrrhic victory that described a universe without a concept of a god that humans could use to reflect back in wonder at themselves.

Later, at a social science event, he heard another scientist report on 'religion'. What struck him was the fact that a much greater number of social scientists self-identified as atheists compared to natural scientists. Paul speculated that this may be because most social scientists have only a superficial understanding of physics, and further that all physicists recognize an element of the mystical deeply embedded in the laws of the universe.

In thinking about Paul's comments, I think it is interesting to look at the different domains of science – living systems (complex) versus the traditional natural (hard) sciences in terms of the propensity to atheism.

What Kauffman & Longo (I’ve read a couple of his papers) did for me is finally articulate the nature of the frame for a physics worldview versus the frame for living systems & evolution.

I've also thought for a long time that the difference between the premodern and modern worldviews was a nuance of the 'cause' of a 'given world'. In the premodern worldview, god(s)/God created the world as it is and we must live in it; the world was therefore 'given' by God.

In the modern worldview, the world is now given by 'laws of nature' - the Newtonian clockwork. So, for example, once we know the position/speed/etc. of each part, the world is essentially given: the future, like the past, is determined.

In the postmodern worldview, the 'predictability' of the world's unfolding becomes impossible despite its theoretically deterministic quality. This is the result of our understanding of chaos and complexity theory.

In this way, the 'mythos' underlying the 'logos' of the hard scientist remains similar to the mythos of religion - a given world - and the corresponding metaphorical entailments of the 'authority' of God apply to the authority of Laws and can yield a type of theism or theistic intuition.

Having said this I agree with Paul's "speculative hypothesis that most social scientists have only a superficial understanding of physics." 

However, I felt that the converse is perhaps more important – that most physicists have a poor understanding of the causal logic of biology (and evolution).

The social sciences, including those of living systems (unlike the natural sciences), must accept the inherent and many fundamental unknowables in the nature of their 'scientific objects' of study. Although I would bet that many social scientists struggle to do science from within the same 'mythos' of the physics worldview.

A deep part of this mythos is that physics and its mathematical toolset require a 'prestatable' (event/phase) space, and proceed to use a language or calculus of trajectory (momentum, force, mass, etc.) to determine what will happen (this is the fundamental conceptual metaphor that frames most science and the sense of objectivity). In this way time (past-future) is revealed.

Here I will refer to a past blog post: the problem with evolution is that it is impossible to prestate the (event/phase) space. Some examples that Kauffman et al. use are built on the concept of 'Darwinian pre-adaptations'. E.g. it is impossible to prestate the potential use of the set of three jaw bones as the mechanisms of the inner ear, or to prestate all the uses of a screwdriver, because it is impossible to know in advance all the possible 'contexts of selection' that could use an artifact like the screwdriver. All these unimaginable potential uses are what Kauffman has termed the 'adjacent possible': none are directly causally determined, but they become 'enabled' by the existence of the screwdriver. The screwdriver is a 'pre-adaptation' of a function that can be unpredictably 'exapted' to another function. These real but virtual 'adjacent possibles' are unprestatable, thus one can't create equations of 'trajectory' - so a new type of 'law' may be more appropriate for the sciences of complex and living systems: a law of 'enablement'.

My example would be how evolution could select for a sensorium capable of symbolic processing - but language is not selected for. By that I mean that while there are genetic causal linkages to language processing, there is no genetic link between such processing and the unpredictable (and infinite) number of languages that are enabled by the processing capability. Language arises at the level of social interaction with no direct causal relation to genes - there are no English, Mandarin, Hindi, Sanskrit, etc. genes.

Physics/Mathematics provides a language structured by a logic of implication.

It is extremely valuable and successful as a way to describe certain domains of reality. But the map is not the territory and the descriptive logic of implication leaves the actual logic of causality unrevealed, inaccessible. For Kauffman et al, evolution, the biosphere, complex systems require a different logical structure that accounts for radical emergence.

Coming back to Paul's comments regarding the greater propensity for atheism in social scientists: could it be that they are more likely to be shaped by the underlying mythos of their objects of study? A mythos of an 'ungiven' world - a world that (as Kauffman says so well) creates the conditions of its own becoming?

Would this entail a non-theistic mystical intuition regarding the world? A deeper appreciation that we cannot navigate through life but must constantly way-find because each step changes the conditions for the next step and there is no ‘knowable’ territory until we have trod on it? (and who knows how it will change after we move on?).

I think Kauffman et al. have augmented my scientific underpinning for a Buddhist worldview that sees the world in a highly pragmatic way but also as a 'sacred' experience - one that guides us to outgrow not only the paradox of object-subject, but also the mutual arrogance of both science and religion in offering us a belief in certainty about the world we live in - a belief that we can be certain, a belief that shapes our efforts to control our lives and paths through life, whether by magic or science.

An unknowable future of an unfolding world, is perhaps a better mythos from which to frame our sciences, and our intuitions of the sacred.


Friday, March 2, 2012

A conversation about dematerialization

I'd like to thank my friends Denis Poussart and Amanda Parr for the conversations that I've embedded in this post. Conversation is surely the act of social thinking.

In a recent post by the Rational Optimist - Dematerialising and deflating the future

In the post, Matt Ridley states:
Dematerialization is occurring with all sorts of products. Banking has shrunk to a handful of electrons moving on a cellphone, as have maps, encyclopedias, cameras, books, card games, music, records and letters—none of which now need to occupy physical space of their own. And it's happening to food, too. In recent decades, wheat straw has shrunk as grain production has grown, because breeders have persuaded the plant to devote more of its energy to making the thing that we value most. Future dematerialization includes the possibility of synthetic meat—produced in a lab without brains, legs or guts.

But he concludes with:
No matter how many prizes we offer, certain growing problems—such as caring for children and the elderly, or policing, or repairing freeways—won't experience much dematerialization or deflation. And as dematerialized goods and services like communication get cheaper, these problems will increasingly dominate budgets, damping the acceleration. So the future may be bright, but not dazzlingly so.

As my friend Denis noted, dematerialization is the complement of virtualization. A great example is the progressive transformation of what we call money: from the exchange of tangible goods (barter) to digital information as the medium of exchange (including bitcoin). Both Denis and Matt make clear that some 'work' still needs physicality - human brawn and embodied knowledge & skills. Furthermore, as the Limits to Growth first suggested, humanity is coming to the brink of what we can access related to the production of our material necessities. Looking at the modern excesses of dematerialization, it is obvious that bits fed on hope and greed easily self-multiply, but it is also patent that they do not feed or shelter. Yes, virtualization has brought and is bringing huge contributions to civilization and mankind. But in the end it is not sufficient for survival. Hard stuff matters, too.
I believe that Denis is absolutely right, that the 'hard stuff' matters - in the morning I need coffee - not a virtual representation. ;)

But on the other hand, even the hard stuff is experiencing (and if nano-bio tech keeps progressing, this will accelerate) a human-made and manipulated dematerialization phase in the way material goods become manifest. For example, the domestication of bacteria such that they can become purpose-built organisms that manufacture energy, materials and medicine. They are real matter but are also the product of a dematerialization in the process of their design - a type of SPIME, referring to Bruce Sterling's concept.


Spime is a neologism for a currently-theoretical object that can be tracked through space and time throughout its lifetime. These are future manufactured objects with informational support so extensive and rich that they are regarded as material instantiations of an immaterial system.

Nano promises the capability to build the hard stuff one atom at a time and therefore represents an informational dematerialization of manufacturing. The new physics may even enable an 'alchemy' that dematerializes one element into another element.

It could be argued that the nano-bio technology convergences are opening a new vista onto an infinite dimension of matter, where we can create the hard stuff in new ways (including an eternal capacity to recycle the old into the new). Looking at it another way, life already does this when it transforms sunlight into new hard stuff - matter that did not exist on the earth beforehand. As long as sunlight continues to hit the earth, one could argue that there is an infinite source of new matter (for all practical purposes). So as we learn to capture this energy (and other forms of ubiquitous energy) with more and more efficiency/effectiveness, the important question becomes: "What is the limit to the growth of our ability to capture and transform energy?" Is the only limit 100% of the sunlight that hits earth, or are there even more forms of energy (e.g. dark energy)? And how far away is that limit? And once there, what new forms of dematerialization will be revealed?
I think the view of a finite world is reasonable in a practical way. But if we really accept this, it implies the earth is a closed system with only what is given. While the oil in the ground is certainly finite at present, it was not here originally - it is the product of sunlight and the ingenuity of living systems; in some ways oil is the materialization of a previously dematerialized situation. This means the earth is not a 'closed system' with a pre-given finitude. The process of complex 'bubbling forth' is one where 'virtual' unmaterialized adjacent possibles were made manifest into matter.

Perhaps this is thinking too far ahead, but the Santa Fe Institute did produce a study which looked at at least four major technology areas, and the rate of progress was not exponential - rather it was super-exponential. So the far-off future may make its appearance sooner than can be imagined.
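I haven't reproduced the Santa Fe analysis, but the distinction is easy to sketch: exponential growth has a constant doubling time, while super-exponential growth (illustrated here with a made-up capability curve of exp(t^2), purely for demonstration) has a doubling time that keeps shrinking - so the 'far-off' arrives faster and faster.

```python
import math

def doubling_time(f, t, dt=1e-6):
    # Local doubling time = ln(2) / (d/dt ln f(t)), estimated numerically.
    growth_rate = (math.log(f(t + dt)) - math.log(f(t))) / dt
    return math.log(2) / growth_rate

exp_f   = lambda t: math.exp(t)      # plain exponential
super_f = lambda t: math.exp(t * t)  # super-exponential (illustrative only)

# Exponential: doubling time stays ~ln(2) no matter how far out we look.
assert abs(doubling_time(exp_f, 1) - doubling_time(exp_f, 10)) < 1e-3
# Super-exponential: doubling time keeps shrinking as t grows.
assert doubling_time(super_f, 10) < doubling_time(super_f, 1)
```

On the exponential curve, progress in the next decade looks like progress in the last; on the super-exponential curve, each successive doubling takes less time than the one before it.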

Having said all this - I of course agree that we must continue to become more vigilant in how we use our resources so that we don't use what is currently available faster than we can sustainably generate more hard stuff. :)
So my friend Amanda suggests that this is what Einstein meant when he said we honour the servant and forget the gift. Wigner said similar things when he explained that there was no algorithm for finding principles: "The irrelevancy of so many circumstances which could play a role in the phenomenon observed has also been called an invariance. However, this invariance...cannot be formulated as a general principle. The exploration of the conditions which do, and which do not, influence a phenomenon is part of the early experimental exploration of a field. It is the skill and ingenuity of the experimenter which show him phenomena which depend on a relatively narrow set of relatively easily realizable and reproducible conditions." Kauffmanesque ;)
Amanda concludes, saying that phenomena existing under such conditions would have their dynamism in the plane of mechanism, and everything else we would, today, call "complexity."

I think this is excellent. Mathematics no more explains reality than the grammar of the English language explains human behavior.

The sentence "That person walked to the chair and sat down" describes the behavior just as mathematics can describe the behavior of an apple falling from a tree. Both descriptions seem to be very accurate. But neither 'explains' the real underlying complex of causal logic by which both behaviors happen. In this way the map is not the territory and the menu is not the meal.

Both descriptions also frame the perceptions of only some dimensions of all the inter-depending aspects of the many causal chains, including temporal scales. In this framing we demarcate patterns/outlines of some aspects as static/solid and others as dynamic/fluid. 
Amanda building on this replies "Thus with respect to mathematical modelling of nature, the plane to which it applies has been defined and cordoned off by the boundary conditions, which Kauffman says are ingeniously "placed by hand" in both experiments and machine design. Whether that makes the effectiveness of mathematics less uncanny, I don't know. It does mean that this method does not apply that well to a broader world where things slip, slither and slide into the adjacent possible."
I love this vision that we too can slip, slither and slide into a differently perhaps even infinitely (for all practical purposes) resourced world as we exceed the boundaries that seem solid in the mathematical calculus of trajectories and harness those energic-material unknowns of manifest-abilities. :) 

Denis is familiar with this Santa Fe study, Superexponential Long-term Trends in Information Technology. But he argues that the world is far from just IT, and even in IT the parameters that have been tracked, while important, are far from telling the whole story. Big IT projects have a horrendous failure rate that increases sharply with size. It is never because of these parameters; projects crash from being unable, or not taking the time, to understand and deal with the plexus of reality.
He notes that "It is critically important, I think, to realize than civilization is now engaged in a **race** (positive feed-back) condition, between, on the one hand, vastly improving technologies (at least from first order observations) and on the other hand from the exploding complexity that such advances are themselves inducing."
However, as he continues, there are no hard, applicable metrics for complexity (it is a wicked problem). But assuming there were, I would be surprised if complexity was not on a similar climb to that of technology - also exponential, super-exponential perhaps in certain facets. So here we are, focusing on the hard, happy measures but very much unable to assess (and largely ignoring) the other side of the mountain (this would be a great topic to dig into).
Now, envisioning the future dynamics of this racing process, I strongly believe that it is appropriate to invoke a concept very much akin to Blackman's Law of limiting factors. The limiting factor here is man's impotence in understanding and making sense of complexity: being unable to steer through the maelstrom with good sense. So, yes, technologies, taken one at a time, are accelerating. But the net societal result does not necessarily follow. In reality, there are studies that suggest the pace of innovation is stagnating - certainly not on a "double exponential".
 
In many ways I am feeling increasingly "entangled", in the quantum sense, with a number of my Noospheric friends. I very much agree with Denis and have been conceiving our human-in-this-world situation very much as a race. In some ways this is why I argue that the proper types of investment we should be making related to climate change are in carbon capture technologies rather than solely or primarily in carbon reduction technologies (carbon capture would involve domesticated bacteria capturing carbon to produce fuel and oil, among many other things).

I also agree with Denis' observations about the domain of IT, but my point is that the increasing power of IT represents the increasing power of the tools we apply to other domains - which means these other domains also gain the type of enablement that computational & network capacity is able to produce.

I too agree on the limiting factor being our impotence to comprehend the complex whole. But one can argue that the lack of comprehension is much less of a limiting factor than we might assume. After all, evolution itself is not (or at least one can argue that it isn't) conscious: it doesn't and needn't understand or make sense of its own unfolding. The brilliance of the technology of evolution (technology as mechanism of selection) is its capacity to continue to adapt. The impotence of comprehension could be seen as an implicit position of the need to control (rather than an evolutionary capacity to improvise). I would think that all of human existence is a demonstration of how we learn to flourish without understanding the whole.

This is why I believe that the 'cautionary principle' is completely inadequate to our problems. This principle is fundamentally biased against the new, the novel, and is blind to the risk of doing nothing, by assuming the status quo is safer than change/experiment.

I think the best principle for this race we are in is the 'Vigilance Principle' (noted by Stewart Brand in his latest book as well as in Kevin Kelly's new book). The Vigilance Principle is essential to agility, focusing on continual experiment and the capacity to act, respond, adjust, adapt, learn, improve, etc.

As far as our current human limit on understanding/making sense? I think the mind is exterior to the brain, and that our tools are the mind's prosthetic extensions. The emerging 'Noosphere' is such an enhancement to the collective human mind. We became cyborgs the moment we created language (perhaps the moment we created tools), but with the creation of language and culture we became programmable and thus cybernetic. Our reach always exceeds our grasp (else what is a Meta-phor?) :) Because, as Kauffman has recently made very clear, evolution - at least its future movement - is beyond 'reason's grasp'. It has and will always be so; we can only be vigilant and adaptively agile. :)

Denis also has misgivings about the Cautionary Principle, especially as it is often envisioned with an attitude that borders on ideology. As he notes: "I am very much in tune with being a vigilant pragmatist, a blend of ultimate optimism with residual skepticism. There is no steady state in Life and there are possibilities like there have never been before. But... I observe that for an increasing fraction of the world, there is a growing bandwidth mismatch between the (accelerating) pace of change (much of it from technology) and the traditional dynamics of human adaptation. These are closely linked to culture and education, and evolve slowly. Ashby's Law is getting less satisfied, to the point that we are beginning to see evidence that the mismatch is inducing cases of break-down / backlash."

I completely agree with Denis' observations. The rise of fundamentalism is a backlash against accelerating change that represents ever increasing 'encounters-with-unknown-others'. Not just technologies, but ideas and actual people as urbanization also accelerates. We are becoming a global village, but it is not the village of yore. It is more like the global metropolis - full of unknown peoples, things, values, situations, choices, liberties.... and technologies.

Cultural change is very much the great barrier to the embrace of requisite diversity in all of its splendor and multifarious forms.

However, I like to think of optimism as the most pragmatic stance a human can take, simply because of what we know about the placebo effect. Pragmatic optimism, through the power of the placebo, optimizes our chances of solutioning.