Abstract

This essay conceptualises models as cultural techniques 1, agential 2 media and designed systems, which catalyse fruitful discussions on serious matters of our commons. The term ‘commons’ here means all the natural, technological and cultural resources accessible to all members of a community. As this essay argues, this is also relevant for artistic and design-based research. Models are called into action especially when these “matters of concern” 3 transcend our intuitive understanding and reach a degree of complexity 4 that goes beyond simple human reasoning. In such cases, we need help from models. They show us, and let us experience, important aspects of the unforeseeable, emergent and sometimes global effects – both positive and negative – of our machines’ day-to-day micro-behaviour. These little actions combine to affect our common resources and infrastructures.

Introduction

Models manifest themselves as mechanical machines, analog computers, synthesisers, software simulations, games, animals, bacteria, ecological systems and much more. They are crystallisations of scientific theory in agential matter, which we humans can see, feel and hear. As such they belong not only to the domain of science and technology, but also to art, design, architecture, dance and sound. Models can be artistic or have an elaborate design. On their way from techno-science to art galleries, design exhibitions, workshops and public interventions, they go through a metamorphosis from ideally being functional to being diffractive, sometimes even dysfunctional. Following this thought, the essay addresses a yet-to-be-realised program of artistic and design-based research. This strategy intends to enable, catalyse and open up fruitful discussions of the ongoing micro-design 5 of the different modes, rules and habits in the human and non-human ‘mattering’, enactment, organisation and management of common resources, goods, knowledge and infrastructures.

Minor matters

During their endeavour to acquire new knowledge, scientists and engineers usually apply established designs of instruments, tools, models and simulations. Often they need to develop their own tools of experimentation, but unfortunately they rarely have time to reflect and experiment upon their extended aesthetic, artistic and design-related aspects. While accuracy of abstraction and technical functionality matter, these aesthetic aspects, although recognised, are of secondary importance. They are minor matters and do not appear on scientific research agendas. As the aesthetics of things, objects and processes are key competences of art and design, everything which addresses the human sense-modalities is a matter of our concern. Given the ongoing cost reduction in electronic tinkering, mechanical parts, computation power, robotics and rapid prototyping, as well as the dawn of open software and hardware projects, researching matters of complexity with alternative models as agents in creative but technologically informed ways is on the verge of becoming an issue of application-oriented fundamental research done at art and design universities.

Practices of modelling are seen both in the field of the arts and in technoscience. Physical models, for instance, are used in both contexts. Nude models, model figures, clay models and sculptures, indeed all sorts of assemblages of body and materials in artistic contexts, are similar to the scale models, model cars, globes, and atom and molecule models once widely used in science and engineering; nowadays, these are used only for pedagogical purposes. The same applies to conceptual models based on feedback circuits. The electronics used in analog computing, a long-forgotten scientific field that proliferated between the 1920s and the 1970s, are basically the same circuitry later incorporated in the audio and video synthesisers used by many composers and artists since the 1960s, such as Nam June Paik (1932–2006), Steina and Woody Vasulka (1940, 1937), Jack Burnham (1931) and many more. Others, such as John Cage (1912–1992) or Hans Haacke (1936), experimented more conceptually with the idea of feedback.

In technoscience, the aim of modelling is “to represent an idea, concept, or situation, usually in a form that facilitates further analysis. The more malleable and flexible the modelling medium, the more powerful and experimental the modelling” (Care 2010). With the dawn of digital computing in the 1980s, scientific modelling was increasingly done with simulations watched on computer screens. At the same time, artists and designers started to use the computer for their creations. But while the creatives still preferred diversity and used not only visual but also sound-based forms of expression, always showing the hardware of their projects and works, scientists and engineers increasingly used only visual, screen-based media for their modelling. Understanding complexity through simulations such as agent-based models, for example, was mostly done via eyes staring at a screen, interacting via keyboard and mouse, later via touch-screen.

The aesthetics of things, objects and processes are minor matters in technoscience. What counts is the content. Therefore, the way simulation and modelling are done has not changed substantially since the dawn of the PC. There are several reasons for this ongoing tendency towards visual and virtual abstraction 6. One is certainly the above-mentioned malleability of digital media, another the printability of computer graphics, but these cannot be fully discussed here. Instead, I will give a brief general account of a speculative, yet-to-be-realised program of artistic and design-based research: a program that would liberate technoscientific models from their aesthetic constraints in order to make them more useful and understandable. Not simpler, but more complicated, affective, disrupting, disputable; more interfering, parasitic and troubling than before.

Common concerns

There are no consistent research programs without some theoretical backing; therefore, some considerations are in order. The legitimacy of doing research in art and design shall not be questioned here. That would be pointless. We build on two assumptions. Firstly, artistic research and experimental design are techniques of emergence 7. Secondly, research practices in science and engineering are similar to design processes (Glanville 1999). Verification of both assumptions occurs only by concrete enactment and unfolding. Focusing on modelling complexity might be an interesting strategy within these conditions, since it affords a coupling to urgent global issues of governing the commons, issues addressed not only to politicians and policy makers, scientists, engineers and critical thinkers such as sociologists, philosophers or historians, but, as I argue here, also to artists and designers.

Convincing examples of such urgent matters are cooperation dilemmas and issues among users of common pool resources. A common pool resource is a type of asset consisting of natural, cultural or social resources: air, drinking water, fishing grounds, pastures, forests, irrigation systems and generally all energy and nutrition resources; but also resources like literature, music, movies, media products in general and open source software. According to our democratic ideals, these are all resources that are or should be held in common, not owned privately, since they affect all connected forces, parties, agents and humans, regardless of their geopolitically allocated influence. Disastrous effects of bad resource management and cooperation dilemmas, such as traffic jams, over-fishing or even climate change, are often the result of the complex interplay of all the micro-forces involved.

Resource sharing is not easy. The “Tragedy of the Commons”, as formulated in 1968 by Garrett J. Hardin (1915–2003) in an article in Science, is one of the most famous depictions of the social problems of sharing common goods. “Picture a pasture open to all. It is to be expected that each herdsman will try to keep as many cattle as possible on the commons. […] Therein is the tragedy. Each man is locked into a system that compels him to increase his herd without limit – in a world that is limited.” (Hardin 1968) This inevitably leads to over-exploitation and resource depletion.
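
To make Hardin’s logic tangible in a few lines, the following minimal sketch (my own illustrative Python, not a reconstruction of Hardin’s or any cited model; herd size, capacity and cost function are assumed values) lets each herdsman add an animal as long as his private gain stays positive, while the cost of overgrazing is shared among all:

```python
# Minimal, illustrative sketch of Hardin's pasture logic (not from any cited source).
# Each herdsman gains the full benefit of an extra animal but shares the cost of
# overgrazing with everyone, so adding cattle stays privately rational until collapse.

HERDSMEN = 10
CAPACITY = 100            # the pasture sustainably supports 100 animals (assumed value)
BENEFIT_PER_ANIMAL = 1.0

def private_gain(total_cattle: int) -> float:
    """Marginal gain for one herdsman adding one more animal."""
    overgrazing_cost = max(0, total_cattle + 1 - CAPACITY) / CAPACITY
    return BENEFIT_PER_ANIMAL - overgrazing_cost / HERDSMEN   # the cost is shared

herds = [0] * HERDSMEN
for season in range(50):
    for i in range(HERDSMEN):
        if private_gain(sum(herds)) > 0:   # privately rational to add an animal
            herds[i] += 1

print(f"total cattle: {sum(herds)}, sustainable capacity: {CAPACITY}")
# The total overshoots the capacity long before the shared cost bites any individual.
```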

About twenty years later, Elinor Ostrom (1933–2012) 8 published Governing the Commons (1990) and therein referred to Hardin. Besides picking up similar models such as the prisoner’s dilemma game, she also took up Mancur L. Olson’s (1932–1998) The Logic of Collective Action (1965), in which he theorised that members of large groups do not act according to a common interest unless motivated by personal economic or social gain, while small groups can act on shared objectives. Personal gain leads to bad results if the resource is limited. Obviously, most resources are limited in some form.
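
For illustration, the prisoner’s dilemma Ostrom picks up can be written down as a payoff table; the concrete numbers below are a common textbook choice of my own, not taken from Ostrom or Olson:

```python
# A standard prisoner's dilemma payoff table (textbook values, not from Ostrom 1990).
# Entry (row, col) gives (payoff_row, payoff_col); C = cooperate, D = defect.
PAYOFF = {
    ("C", "C"): (3, 3),
    ("C", "D"): (0, 5),
    ("D", "C"): (5, 0),
    ("D", "D"): (1, 1),
}

def best_reply(opponent_move: str) -> str:
    """Return the move that maximises my own payoff against a fixed opponent move."""
    return max("CD", key=lambda my_move: PAYOFF[(my_move, opponent_move)][0])

# Defection is the best reply to both moves, so two rational players end at (1, 1),
# although (3, 3) would have been available to them jointly.
print(best_reply("C"), best_reply("D"))   # -> D D
```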

Ostrom was more optimistic than her precursors and described many real-world cases in which sharing a common resource pool works. Drawing on field research and on game theory, she famously formulated eight design principles for stable local common pool resource management (Ostrom 1990). Most importantly, the operational rules of resource usage need to be defined by the participants themselves: not top-down, but bottom-up.

Governing the Commons was highly influential. In the twenty-five years since 1990, models of common resource sharing have been implemented in digital computer simulations; and concepts from cybernetics 9, system dynamics (Castillo and Saysel 2005), chaos theory (Wilson et al. 1994) and even cellular automata theory have informed this still evolving field of research (Berge and van Laerhoven 2011). Ostrom herself referred not only to W. Ross Ashby’s (1903–1972) An Introduction to Cybernetics, but also to Thomas Schelling’s (1921) Micromotives and Macrobehavior as well as Ilya Prigogine’s (1917–2003) Time, Structure and Fluctuations. 10 Both are discourse-founding texts (1978, 1977) that established research in emergent, counter-intuitive group behaviour, nonlinear dynamics, self-organisation and complex systems.

Modelling the complexity of micro-actions within the field of the commons is promising. These and other connections allow innovative links to alternative modes of modelling beyond digital simulations, such as analog computing, feedback circuitry and hybrid models. More recent research combining the commons with computer gaming is equally promising. Since 2015, after the rise of serious gaming (Schuller et al. 2013), tragedy-of-the-commons games have been playable online. Still, steps towards recursively applying such models back to the users of common pool resources in order to inform them are rare, even though the importance of self-organisation and self-governance has already been formulated (Castillo and Saysel 2005). A conventional design task would then be to facilitate and enable such processes of self-organisation by improving aspects of visual communication, product or interaction design. Victor Papanek (1923–1998) was, from the 1970s onwards, one of the first to combine ecological thinking with product design. More recent emerging research fields are participatory design, design in the context of citizen science, eco-design, sustainability and transformative design. Such projects will hopefully provoke interest from institutions such as the Centre for Policy Modelling in Manchester (Edmonds and Gershenson 2015). A slightly more radical approach would claim that merely playing a prisoner’s dilemma or a tragedy-of-the-commons game is not enough, since making, programming and designing such a game involves much more learning and is therefore much more effective; a minimal sketch of what such a game could look like follows below.
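
The following sketch, my own hypothetical Python example rather than any of the online games cited above, indicates how little is needed to make such a game playable: a shared stock, per-round harvest decisions and a regrowth rule. Growth rate, carrying capacity and player count are assumed values:

```python
# A minimal, hypothetical common-pool resource game (illustrative only, not one of
# the online games referenced above). Players type a harvest each round; the stock
# regrows logistically, so restraint pays off collectively but not individually.

GROWTH_RATE = 0.25   # assumed regrowth per round
CAPACITY = 120       # assumed maximum stock

def play(players: int = 3, rounds: int = 8, stock: float = 100.0) -> None:
    for r in range(1, rounds + 1):
        print(f"\nRound {r}: stock = {stock:.1f}")
        for p in range(1, players + 1):
            take = float(input(f"  Player {p}, how much do you harvest? "))
            take = max(0.0, min(take, stock))   # nobody can take more than is left
            stock -= take
        # Logistic-style regrowth: fast when the stock is healthy, slow when depleted.
        stock += GROWTH_RATE * stock * (1 - stock / CAPACITY)
        if stock <= 1:
            print("The resource has collapsed. Tragedy.")
            return
    print(f"\nFinal stock: {stock:.1f}")

if __name__ == "__main__":
    play()
```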

Diffractive modelling

Design work and aesthetic experimentation with the communicative affordances offered by models within fields such as the commons unfolds firstly via contact with policy makers and managers working in organisations active in these fields, and secondly via linking more directly with the actors, users and workers involved. The experience of complex matters via artistic and design-based sense-making with models and interaction is surely insightful. An emphasis on enabling not just a reflective, but a diffractive understanding 11 of common pool resources might prove to create stronger impacts. This is the speculation this short essay builds upon.

According to Karen Barad (1956) and Donna Haraway (1944), diffraction, in contrast to reflection (the term commonly used in conjunction with critical inquiry or critical thinking), is not merely about mirroring without influence. It is also about positively interfering with, blurring, bending and transforming the content under study (Haraway 1992). It is a spurious différance (Bernasconi and Wood 1988) or a smeary differentiation. When you drop two stones into a pond, they generate ripples that interfere and interweave on the water surface. Similarly, fields as diverse as the arts and technoscience could positively interfere with each other while still maintaining their specificity and characteristics. A diffractive inquiry transforms and bends its subject in order to create a range of alternative approaches for its study, but it also tries to maintain high fidelity to its sources.

Diffractive modelling designs communication between the matter to be understood, the model and its user in a highly flexible, if not agential, manner. Agential is a term I again borrow from Barad. Quantum physics showed us that “theoretical concepts are defined by the circumstance required for their measurement” (Barad 1998). This means “that there is no unambiguous way to differentiate between the ‘object’ and the ‘agencies of observation’” (Barad 1998). Not only is the user (inter)acting with the model, the model is acting back and becomes an agent. Model, user and creator are all, as agents, coupled to each other like in a ménage à trois. As Henri Poincaré (1854–1912) showed, three interacting bodies generate nonlinear dynamics (Barrow-Green 1996). A minor change in the condition of one of those three is agential upon the remaining two. Furthermore, as in quantum physics, where a particle can become a wave and vice versa (Barad 1998), diffractive modelling is never static. Ideally, it is unfailingly experimental, slippery, strange and peculiar; not sticking to one version of modelling, but constantly researching new forms of representation, aestheticisation and sense-making. It therefore demands a lot of effort.

To be more concrete concerning different techniques of modelling, a first step would be to broaden the current aesthetic qualities of simulation and modelling by transgressing again into physical space and real-world processes – as was done in the past with physical models, electrical equivalents (analog computing) and mechanical types of models. Furthermore, by combining digital computation with electronics and real-world actuators such as motors, electromagnets, hydraulic, optical or acoustic systems and more advanced sorts of transducers – in other words, by combining hardware with software, or by carrying out physical computing – positive diffractions of current modelling and simulation practices could emerge. Neighbouring fields such as interface and interaction design and research on the so-called Internet of Things at institutions such as the MIT Media Lab, the Royal College of Art, ETH Zurich and many more offer thousands of starting points to unfold diffractive modelling.
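
One hypothetical example of such a hybrid set-up, sketched here in Python under the assumption of a Raspberry Pi running the gpiozero library with an LED (or any PWM-driven actuator) on GPIO pin 17, would couple the state of a simulated common-pool resource to a physical output, so that the room literally darkens as the commons empties:

```python
# Hypothetical physical-computing sketch: couple a software simulation to an actuator.
# Assumes a Raspberry Pi with the gpiozero library and an LED on GPIO pin 17;
# any PWM-capable actuator (motor driver, dimmer, fan) could stand in for the LED.
from time import sleep
from gpiozero import PWMLED

CAPACITY = 100.0
led = PWMLED(17)            # brightness stands in for the health of the resource

stock = 100.0
harvest_per_step = 6.0      # assumed, deliberately unsustainable harvesting
growth_rate = 0.05

while stock > 0:
    stock = max(0.0, stock - harvest_per_step)
    stock += growth_rate * stock * (1 - stock / CAPACITY)
    led.value = stock / CAPACITY   # dim the light as the commons is depleted
    sleep(0.5)
```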

It was not by accident that for a long time interactive modelling was regarded as the domain of analog computing, while information processing and calculation were regarded as the domain of digital computation (Care 2010). Analog systems operate in real time. There is no symbolic translation of the matter in action; no data are involved. That was their specificity, but the acceleration of digital processing made them redundant. Historical ignorance is obviously no option. Digital simulation should thus not be abandoned, but extended with analog computing: agent-based modelling not only as virtual simulation, but more as some sort of tangible real-world happening, not fully out of control, but only a little bit under control.

The history of analog computing affords a whole ecosystem of strange apparatuses, peculiar assemblages and unheard-of models. Soap bubbles were used in aerospace engineering to obtain mathematical solutions of the so-called Laplace equation; their surface was an analogy for mathematical principles (Care 2010). Hydraulic flow systems were used to model the national economy of the United Kingdom (Care 2010). Electrolytic tanks were widely used for oil reservoir modelling (Care 2010), and so-called rotating dishpan models for chaotic fluid dynamics (Care 2010). Could such strange apparatuses become diffractive models for experiencing and understanding current matters of concern? In the early 1960s, the British cybernetician Stafford Beer (1926–2002) experimented with organisms such as leeches in an artificial pond, which were meant to model a whole economy. He did not succeed (Pickering 2011), but in a recent scientific field called unconventional computing, leeches were used to model the behaviour of humans fleeing buildings (Adamatzky and Sirakoulis 2015).

Diffractive modelling is not meant to become fully useful in the straightforward sense of the word. It is not about utilising creativity in order to solve global issues. It is more about addressing them through interactive involvement via modelling and experiences that provoke alternatives to established practices. Diffractive practices need to stay vague and experimental in order to enable new modes of coupling. At the same time, it is important to stay down-to-earth and to maintain high fidelity to the sources of the interference. How to talk about serious matters of complexity with models as agents is not an answer but a question, and it should indeed be the main driving force of such a difficult endeavour.

  1. Adamatzky, Andrew, and Georgios Ch. Sirakoulis. 2015. “Building Exploration with Leeches Hirudo Verbana.” Biosystems 134: 48–55. https://doi.org/10.1016/j.biosystems.2015.06.004.
  2. Ashby, W. Ross. 1956. An Introduction to Cybernetics. John Wiley & Sons.
  3. Barad, Karen. 1998. “Getting Real: Technoscientific Practices and the Materialization of Reality.” Differences: A Journal of Feminist Cultural Studies 10: 87–128.
  4. Barrow-Green, June. 1996. Poincaré and the Three Body Problem. Oxford University Press.
  5. Berge, Erling, and Frank van Laerhoven. 2011. “Governing the Commons for Two Decades: A Complex Story.” International Journal of the Commons 5 (2): 160–87.
  6. Bernasconi, Robert, and David C. Wood. 1988. Derrida and Différance. Evanston, IL: Northwestern University Press.
  7. Care, Charles. 2010. Technology for Modelling: Electrical Analogies, Engineering Practice, and the Development of Analogue Computing. Springer.
  8. Castillo, Daniel, and Ali Kerem Saysel. 2005. “Simulation of Common Pool Resource Field Experiments: a Behavioral Model of Collective Action.” Ecological Economics 55 (3): 420–36.
  9. Edmonds, Bruce, and Carlos Gershenson. 2015. “Modelling Complexity for Policy: Opportunities and Challenges.” In Handbook on Complexity and Public Policy, edited by Robert Geyer and Paul Cairney, 205.
  10. Ernst, Wolfgang. 2013. “From Media History to Zeitkritik.” Theory, Culture & Society 30 (6): 132–46.
  11. Glanville, Ranulph. 1999. “Researching Design and Designing Research.” Design Issues 15 (2): 80–91.
  12. Haraway, Donna. 1992. “The Promises of Monsters: A Regenerative Politics for Inappropriate/d Others.” In Cultural Studies, edited by Lawrence Grossberg, Cary Nelson, and Paula Treichler, 295–337. Routledge.
  13. Hardin, Garrett. 1968. “The Tragedy of the Commons.” Science 162 (3859): 1243–48. https://doi.org/10.1126/science.162.3859.1243.
  14. Hayles, N. Katherine. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. University of Chicago Press.
  15. Johnston, John. 2008. The Allure of Machinic Life: Cybernetics, Artificial Life, and the New AI. MIT Press.
  16. Latour, Bruno. 1986. “Visualization and Cognition.” Knowledge and Society Studies in the Sociology of Culture Past and Present 6: 1–40.
  17. ———. 2004. “Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern.” Critical Inquiry 30 (2): 225–48.
  18. Latour, Bruno, and Steve Woolgar. 1986. Laboratory Life: The Construction of Scientific Facts. Princeton University Press.
  19. Manning, Erin, and Brian Massumi. 2014. Thought in the Act: Passages in the Ecology of Experience. University of Minnesota Press.
  20. Mindell, David A. 2002. Between Human and Machine: Feedback, Control, and Computing before Cybernetics. Johns Hopkins University Press.
  21. Mitchell, Melanie. 2009. Complexity: A Guided Tour. Oxford University Press.
  22. Ostrom, Elinor. 1990. Governing the Commons: The Evolution of Institutions for Collective Action. Cambridge University Press.
  23. Parikka, Jussi. 2011. “Operative Media Archaeology: Wolfgang Ernst’s Materialist Media Diagrammatics.” Theory, Culture & Society 28 (5): 52–74.
  24. ———. 2013. “Afterword: Cultural Techniques and Media Studies.” Theory, Culture & Society 30 (6): 147–59.
  25. Pickering, Andrew. 2011. The Cybernetic Brain: Sketches of Another Future. University of Chicago Press.
  26. Schuller, Björn W., Ian Dunwell, Felix Weninger, and Lucas Paletta. 2013. “Serious Gaming for Behavior Change: The State of Play.” IEEE Pervasive Computing, no. 3: 48–55.
  27. Turner, Fred. 2006. From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press.
  28. Wiener, Norbert. 1948. Cybernetics: Or Control and Communication in the Animal and the Machine. MIT Press.
  29. Wilson, James A., James M. Acheson, Mark Metcalfe, and Peter Kleban. 1994. “Chaos, Complexity and Community Management of Fisheries.” Marine Policy 18 (4): 291–305.
  1. See Afterword: Cultural Techniques and Media Studies (Parikka 2013)

  2. See Getting Real: Technoscientific Practices and the Materialisation of Reality (Barad 1998)

  3. See Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern (Latour 2004)

  4. I define complexity as a phenomenon linked to a system, network, collection or assemblage with many parts that interact with each other in multiple ways, so that they generate effects that are unforeseen, neither ordered (predictable) nor totally random, but in-between these states. See (Mitchell 2009)

  5. This notion is strongly influenced by Wolfgang Ernst’s concepts of micro-time and time-criticality. See (Ernst 2013) and (Parikka 2011)

  6. Early works by Bruno Latour are pertinent to this question, see (Latour 1986) and (Latour and Woolgar 1986)

  7. “Experimental practice embodies technique toward catalysing an event of emergence whose exact lineament cannot be foreseen. […] Technique is therefore processual: it reinvents itself in the evolution of a practice. […] This idea of research-creation as embodying techniques of emergence takes it seriously that a creative art or design practice launches concepts in-the-making.” (Manning and Massumi 2014)

  8. In 2009 Ostrom shared the Nobel Prize in Economics with Oliver E. Williamson for her analysis of common resource pool governance. 

  9. For those who are inclined to ask for references, here is a short but pertinent list: Cybernetics: Or Control and Communication in the Animal and the Machine (Wiener 1948); An Introduction to Cybernetics (Ashby 1956); How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Hayles 1999); Between Human and Machine: Feedback, Control, and Computing before Cybernetics (Mindell 2002); From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (Turner 2006); The Allure of Machinic Life: Cybernetics, Artificial Life, and the New AI (Johnston 2008); The Cybernetic Brain: Sketches of Another Future (Pickering 2011)

  10. See bibliography in Governing the Commons (Ostrom 1990)

  11. Thanks to my friend and current work colleague Jamie C. Allen for pointing me towards this important vein of post-feminist materialist theory.