Socio-technical systems as artificial agents

This third group of artificial agents sits somewhere between organization and technology, as a blend of both. To understand the agency of socio-technical systems, it is helpful to remember that they belong to the class of complex adaptive systems. These are open systems: their perimeter is permeable. Embedded in a larger environment, they obtain material, information, and energy from the outside, and release their products (and waste) to the outside. These are nonlinear systems: they will not always respond proportionately to an outside stimulus. Rather, a small trigger might have massive irreversible effects (tipping points); or a big stimulus might cause tiny outcomes (diminishing marginal returns). These are also dynamic systems: their inner configuration and outer perimeter can change. They may lose some of their component elements to the environment or adopt new components from there. In a complex adaptive system, each of the many components has some level of agency and autonomous decision-making capacity. These agents have many interactions amongst themselves and with their environment, creating any number of feedback loops, which shape the system’s overall characteristics.
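To make the nonlinearity tangible, here is a minimal toy sketch (my own illustration, not part of the systems discussed here; all names and numbers are invented): a ring of agents where each agent activates once enough of its neighbours are active. Depending on how easily the agents are convinced, a one-agent nudge can tip the whole system, while a ten-times-larger stimulus barely spreads at all.

```python
def cascade(n_agents, threshold, seed_size):
    """Toy cascade on a ring of agents: each inactive agent activates
    once at least `threshold` of its two neighbours are active
    (a crude stand-in for reinforcing feedback loops)."""
    active = [False] * n_agents
    for i in range(seed_size):          # the initial outside stimulus
        active[i] = True
    changed = True
    while changed:                      # let the feedback loops run to rest
        changed = False
        for i in range(n_agents):
            if active[i]:
                continue
            neighbours = [active[(i - 1) % n_agents], active[(i + 1) % n_agents]]
            if sum(neighbours) >= threshold:
                active[i] = True
                changed = True
    return sum(active)

# A one-agent nudge tips the entire ring (tipping point)...
print(cascade(100, threshold=1, seed_size=1))    # -> 100
# ...while a 10x larger stimulus stalls if agents need more convincing
# (diminishing returns): only the original ten stay active.
print(cascade(100, threshold=2, seed_size=10))   # -> 10
```

The point of the sketch is only that the outcome is wildly disproportionate to the size of the stimulus, exactly the kind of nonlinearity described above.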

Even with complete knowledge of all the components’ capacities and current state —which is practically impossible— the system’s reactions, actions, and behaviours cannot be predicted. Such systems are not predetermined, though they are path-dependent: as observers, we can explain in retrospect what happened, yet we are in no position to foresee or predict what will happen. This seeming paradox was best captured by Danish philosopher Søren Kierkegaard:

Life can only be understood backwards, but it must be lived forwards.

No matter how perplexing, these characteristics are the precondition for such a system’s eponymous capability and most stunning feat: it can adapt. A complex adaptive system can adjust itself to continue its mission even when conditions change. Faced with a new situation, a lack of some important input, or the loss of some of its components, such a system will not immediately freeze into inaction: it can reconfigure itself so that it can keep going. Usually, we do not see the specific adaptations inside the system as it responds to a concrete change of conditions. These adaptations occur deep within, hidden from view: in the interactions between its components, in the feedback loops that can reinforce those interactions.

We can, however, observe changes in the system’s overarching actions, in its macro-behaviour. Those changes expose two additional features that are characteristic of all complex adaptive systems: emergence and resilience.

  • Adaptations may start out as temporary —in response to a transient change— but they can become permanent in the face of enduring change. Further changes can trigger additional adaptations; over time, new adaptations can build upon earlier adaptations. The system can thus gradually evolve towards something new: adopting additional components from the environment, acquiring novel features and capacities, or developing novel actions. Such emergence (emerging or emergent behaviour) is often considered the hallmark of complex adaptive systems. Using an anthropocentric image, emergence can be described as a system’s ability to surprise a human observer with an unexpected action, doing something “that it has never done before”.
  • Adaptations are not random, though. They are triggered by changing external conditions, yet they remain focused on the system’s overarching purpose (such as surviving, making profit, or moving data from A to B). This ‘mission’ anchors the system, focusing it on those adaptations that are useful (rather than those that are merely possible). Unlike inertia (“plough through and stay your course”) or resistance (“dig in and weather the storm”), resilience allows for an elastic response when coping with external shock. However, this ability to bounce back (aptly labelled bouncebackability) is not infinite. It is constrained by the resources available to the system, the capacity of its constituent components, and the speed of the interactions between them. So there are clear limits to the changes such a system can cope with, the shocks it can absorb. Yet the thresholds to failure are considerably higher for these systems than for systems that are merely complex.
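This mission-anchored resilience, and its limits, can be sketched in miniature (again my own invented illustration: the network, node names, and failure scenario are all hypothetical). The mission is moving data from A to B; the system reroutes around a first failure, but a second shock pushes it past its threshold to failure.

```python
from collections import deque

def can_deliver(links, source, target):
    """Breadth-first search: can the system still move data
    from source to target over the surviving links?"""
    graph = {}
    for a, b in links:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {source}, deque([source])
    while queue:
        node = queue.popleft()
        if node == target:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# A small network with a redundant path from 'A' to 'B'.
links = [("A", "x"), ("x", "B"), ("A", "y"), ("y", "B")]

print(can_deliver(links, "A", "B"))                    # True: all is well
survivors = [l for l in links if l != ("x", "B")]      # one link fails
print(can_deliver(survivors, "A", "B"))                # True: rerouted via y
crippled = [l for l in survivors if l != ("y", "B")]   # a second shock
print(can_deliver(crippled, "A", "B"))                 # False: threshold crossed
```

The elasticity comes entirely from the redundancy of the components and their interconnections; once that spare capacity is exhausted, the system fails, which is the bounded bouncebackability described above.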

This conceptual backdrop sets the stage for showcasing the agency of socio-technical systems. Let us look at three situations: how we trade, how we share information, and how we live.

Trading — Since humanity’s early days, we have bartered with our neighbours, trading “one sheep for fifteen chickens”. Over time, we developed long-distance trade between strangers and invented money as an abstract measure of economic value. Today we have an intricate thicket of local markets, national economies, regional economic blocs, and global supply chains. Adam Smith’s famous image of the ‘invisible hand’ perfectly conveys the notion of a national economy’s agency: we clearly see the results of its work, but we never see that work itself. Nevertheless, the economy does affect us: by setting prices, it defines the availability of raw materials, goods, and services; by setting interest rates, it orients investments towards the most promising targets (whether countries, commodities, industry sectors, or technologies).

Sharing information — Our craving for sharing gossip is as old as language itself. With the invention of script we kept records, and with the printing press came books, pamphlets, and newspapers. Once we learned about electromagnetic waves, we developed the telegraph, radio, and television. More recently, we created the internet, which gave us social media and the platform economy, whose advertising-funded micro-targeting ushered in the end of media as we knew them. The media —newspapers in particular— were once held in high regard as the fourth estate that keeps the government honest and the public informed. Today, we are losing that positive agency, strangely willing to replace it with hate speech, conspiracy theories, and echo chambers on social media.

Living — Around the globe, across cultures, and over many millennia, our cities have evolved as multilayered and unwieldy spaces (often serving as a synonym for ‘complexity’). Even though the visible parts of a city, its buildings and infrastructure, take considerable effort, time, and investment to build and maintain, its overall development does not follow any master plan. Rather, any city evolves under a distributed, diverse, layered multitude of ownerships, responsibilities, decisions, and actions. Yet cities are much more than “the substrate that we live on”. Cities structure our everyday lives, essentially defining the location and availability of housing, food, work, recreation, and entertainment. Through their transportation systems, they even shape the encounters we may (or may not) have with each other.

These three examples of socio-technical systems all combine different forms of organizations and technologies. They all exist as a consequence of our and our forebears’ earlier decisions and choices. And they all influence our everyday lives in many direct and indirect ways: their artificial agency wields power over us. We gave it to them: sometimes deliberately, but more often without intention.

This is the fifth in a series of posts on agency and how it matters to innovation.

What's your view?
