Tuesday, June 3, 2008

Which comes first: the chicken or the egg? Pattern Formation Models in Biology, Music and Design.

by Katerina Tryfonidou & Dimitris Gourdoukis

The popular riddle that asks whether the egg comes before the chicken or vice versa implies a vicious circle in which all the elements are already known to us and one simply succeeds the other in a totally predictable way. In this article we will argue, drawing on fields as diverse as experimental music and molecular biology, that development in architecture, with the help of computation, can escape such a repetitive motif. On the contrary, by employing stochastic processes and systems of self-organization, each new step can be a step into the unknown, where predictability gives way to unpredictability and controlled randomness.

01. Music

The Greek composer and architect Iannis Xenakis, in his book Formalized Music[1], divides his works (or rather the methods employed to produce them) into two main categories: deterministic and indeterministic models. The two categories, apparently derived from mathematics, refer to whether or not randomness is involved in the compositional process. As Xenakis himself explains, “in determinism the same cause always has the same effect. There’s no deviation, no exception. The opposite of this is that the effect is always different, the chain never repeats itself. In this case we reach absolute chance – that is, indeterminism”[2]. In other words, a deterministic model does not include randomness and will therefore always produce the same output for a given starting condition; differential equations, for example, tend to be deterministic. Indeterministic, or stochastic, processes, on the other hand, involve randomness, and will therefore produce a different output each time the process is repeated, even given the same starting condition. Brownian motion and Markov chains are examples of such stochastic mathematical models. Xenakis’ compositional inventory includes processes from both categories[3].

As noted above, the use of stochastic models in composition results in a process that produces a different outcome each time it is repeated. Using Brownian motion[4], for example, to create the glissandi of the strings (see figure 01: particles generated using Brownian motion[5]) means that the glissandi are generated through a process that includes randomness; if we try to generate them again, we will get a different output. At the same time, all the different results of the process will share some common characteristics. With that in mind, one would expect such a musical composition to vary, at least in some aspects, each time it is performed. That, however, is not the case with Xenakis’ works. While he employed stochastic processes to generate several parts of his scores, he always “translated” his compositions into conventional musical notation, in such detail that no room was left for the performer to improvise or to approach the composition in a different way. In other words, the generation of the score involves randomness to a great extent, but the score is finalized by the composer, so that each time it is performed it remains the same.

Figure 01: Brownian motion. Object-e architecture: space_sound. 2007.
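
To make the idea concrete, here is a minimal sketch of such a process in code. The pitch range, step size and function names are our own illustrative assumptions, not Xenakis’ actual procedure; the point is only the structure of a one-dimensional random walk:

```python
import random

def brownian_glissando(start_pitch, steps, step_size, seed=None):
    """Generate a pitch trajectory as a one-dimensional random walk.

    Each step adds a small random increment to the current pitch, so every
    run yields a different glissando that nevertheless shares the same
    statistical character with all the others.
    """
    rng = random.Random(seed)
    pitch = start_pitch
    trajectory = [pitch]
    for _ in range(steps):
        pitch += rng.uniform(-step_size, step_size)
        trajectory.append(pitch)
    return trajectory

# Two runs from the same starting condition produce different outputs:
print(brownian_glissando(60.0, 10, 1.5))
print(brownian_glissando(60.0, 10, 1.5))
```

Two calls with the same starting condition yield two different trajectories; fixing the seed would make the walk repeatable, which is, in effect, what Xenakis did when he froze the generated material into a definitive score.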


What is perhaps even more interesting is that Xenakis did compose scores that are different each time they are performed. Those scores, however, usually employ deterministic mathematical models, that is, models that do not include randomness. In those cases the situation is inverted: the generation of the score is deterministic, but the performance may vary.

An example of the latter case is Duel, a composition based on game theory[6]. The composition is performed by two orchestras led by two conductors, and it is literally a game between the two that in the end yields a winner. For each move, each conductor has to select one of seven options predefined by the composer. A specific scoring system is established, and the score of each orchestra depends on the choices of both conductors[7]. The result of this process is that each time the composition is performed, the outcome is different. A deterministic system of seven specific, predefined elements therefore produces a result that varies with each performance of the score. To make things even more complicated, the seven predefined musical elements were themselves composed by Xenakis with the use of stochastic processes. To summarize the structure of Duel: Xenakis generated seven different pieces using stochastic processes, pieces, that is, that include randomness. Those pieces, however, were finalized by the composer into a specific form. They are then given to the conductors, who are free to choose one for each move of the performance. The choice of each conductor, however, is not random: “… it is [not] a case of improvised music, ‘aleatory’, to which I am absolutely opposed, for it represents among other things the total surrender of the composer. The Musical Game accords a certain liberty of choice to the two conductors but not to the instrumentalists; but this liberty is guided by the constraints of the Rules of the Game, and which must permit the music notated by the score to open out in almost unlimited multiplication.”[8] The choices of each conductor are thus based on the strategy he follows in order to win the game, and consequently on the choices of the other conductor. The final performance of the score is therefore different each time.
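
The logic of the piece fits in a short sketch. The payoff matrix below is a hypothetical stand-in (Xenakis’ actual scoring table is given in Formalized Music), and the “fictitious play” strategy is only one plausible way a conductor might play; what matters is the structure: seven finalized fragments, a deterministic scoring rule coupling the two conductors’ choices, and strategic rather than random selection.

```python
import random
from collections import Counter

TACTICS = range(7)  # the seven fragments, precomposed and finalized by the composer

# Hypothetical zero-sum payoff matrix: PAYOFF[a][b] is conductor A's gain
# when A plays fragment a against B's fragment b. Xenakis' actual scoring
# table is different; only the structure of the game matters here.
PAYOFF = [[(3 * a + 5 * b) % 7 - 3 for b in TACTICS] for a in TACTICS]
# B's view of the same game: B's gain when B plays i against A's j.
PAYOFF_B = [[-PAYOFF[j][i] for j in TACTICS] for i in TACTICS]

def best_reply(payoff, opponent_history):
    """Choose the tactic with the best total score against the opponent's
    observed choices ('fictitious play'); the opening move is left to chance."""
    if not opponent_history:
        return random.choice(TACTICS)
    freq = Counter(opponent_history)
    return max(TACTICS, key=lambda i: sum(payoff[i][j] * n for j, n in freq.items()))

def perform(moves=20):
    moves_a, moves_b, score = [], [], 0
    for _ in range(moves):
        a = best_reply(PAYOFF, moves_b)
        b = best_reply(PAYOFF_B, moves_a)
        score += PAYOFF[a][b]
        moves_a.append(a)
        moves_b.append(b)
    # the two sequences of fragments are the performance; they vary per run
    return moves_a, moves_b, score

seq_a, seq_b, result = perform()
print("winner:", "orchestra A" if result > 0 else "orchestra B" if result < 0 else "draw")
```

Because each conductor replies to the other’s history, the sequence of fragments, that is, the performance itself, differs from run to run, even though nothing in the scoring rule is random after the opening move.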

Xenakis was quite specific in his decisions regarding the use of deterministic or indeterministic processes. In most cases he employed models from both categories in a single composition. More importantly, while he called his music “stochastic music”, the stochastic part, the place where randomness occurs, is always internal to the process and totally controlled by the composer. The final result (the score that reaches the performer, and even more so the listener) is always specific. Even in the cases where the outcome may vary, it does so within a set of predefined solutions already anticipated by the composer.

02. Life Science

The study of morphogenesis, one of the most complex and fascinating research topics of modern biology, aims to explain the processes that control the organized spatial distribution of cells during the embryonic development of an organism; in other words, how, starting from a single fertilized egg, the overall structure of the body is formed through cell division and specialization[9]. According to Deutsch and Dormann[10] there is a continuum of different approaches to the problem: on one end there are theories of preformation, and on the other, systems of self-organization. The concept of preformation assumes that any form is preformed and static; any new form is therefore always the result of a combination of already existing forms. Self-organization, by contrast, implies de novo pattern formation that is dynamic and develops over time. Morphogenesis in the self-organization model depends on the interaction between the initial cells or units. Preformation is a top-down idea, while self-organization is a bottom-up system. In both cases, research uses computation as the necessary medium for simulating the biological processes involved. We will argue that, according to the latest research in biology, morphogenesis can be approached as a process that involves both the notion of preformation and that of self-organization.

During morphogenesis, cells proliferate and specialize, i.e. they choose which subset of proteins to express. But how do cells know when to divide or where to specialize? How do cells know where to become skin or bone, or how many fingers they should form? It turns out that the key to understanding morphogenesis is the way cells sense and respond to their environment.

Cells obtain information about their environment by using proteins embedded in their membrane to sense specific “message” proteins located around them. When such a “message” protein binds to a membrane protein, the cell “receives” the message and acts accordingly[11]. During morphogenesis, therefore, there is a constant interaction, through such “message” proteins, between each cell and its neighboring cells. This interaction helps cells understand where in the body they are located, when they should divide, and when they need to specialize into a particular type of cell.

This function, or rather sequence of functions, has been the focus of scientific research for the past two decades. Molecular biology can now accurately describe many steps of the reactions between proteins and how they relate to cell specialization[12]. It has been shown that these reactions follow specific physical laws which can be described by mathematical models: given a pair of proteins, for example, the properties of the resulting interactions are known. Because of the physical laws being applied, the model of the function of cells is a deterministic one, since it is made of many elementary interactions between proteins that have well-defined inputs and outputs. Taking this notion a step further, one could argue that the function of the cells implies the idea of preformation; that is, from two predefined elements only one possible combination can occur. In a way, the deterministic rules that the reactions of the proteins follow can be seen as a model of preformation, where there is only one output for a given input.

Although the reactions between the proteins inside the cell follow specific, well-defined rules, there is still a great degree of unpredictability in the life and function of each cell. Why can science not predict exactly what the next “moves” of the cells will be, and thus control all the functions in a (human) body? Although the outcome of the interaction between two proteins has been studied and analyzed, it is not possible to define deterministically when and where this interaction will take place, since proteins move randomly in space and can interact only if they come into proximity and into the proper relative orientation. This is true for proteins both inside and outside the cell. Furthermore, it is not possible to define the exact location of neighboring interacting cells, when each cell will sense the presence of a “message” protein, when and how strongly the cell will respond to this signal by secreting its own “message” proteins, or when its neighbors will sense this signal. Given the number and complexity of the functions in each cell, as well as the vast possibilities of interaction with its neighboring cells, the large number of processes that could potentially happen cannot be expressed by deterministic models.

Since there is, to a certain degree, randomness in cellular functions, science turned to stochastic models in order to explain them. That is, instead of deterministic mathematical models, scientists use models that incorporate probabilities, in order to account for the large number of possible actions. Brownian motion, for example, is the stochastic model that describes the movement of particles in fluids, and it is therefore used to describe the movement of proteins inside the cell. Stochastic processes can describe the spatial and temporal distribution of interactions inside cells and between neighboring cells.
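
A toy simulation makes the point concrete. The sketch below is our own illustrative reduction, not a biochemically accurate model: two “proteins” perform random walks and react only when they come within a given distance. The reaction rule itself is deterministic, but when and where it fires differs on every run:

```python
import random

def walk(position, step=1.0):
    """One Brownian-like step: a small random displacement in x and y."""
    x, y = position
    return (x + random.uniform(-step, step), y + random.uniform(-step, step))

def simulate_encounter(p1=(0.0, 0.0), p2=(10.0, 10.0), radius=1.0, max_steps=200000):
    """Walk two 'proteins' until they come within `radius` of each other.

    The reaction rule (react when close enough) is deterministic; *when*
    and *where* it fires is stochastic and differs on every run.
    """
    for t in range(max_steps):
        p1, p2 = walk(p1), walk(p2)
        dx, dy = p1[0] - p2[0], p1[1] - p2[1]
        if dx * dx + dy * dy <= radius * radius:
            return t, p1
    return None, None  # no encounter within the simulated time window

for run in range(3):
    t, where = simulate_encounter()
    print(f"run {run}: reaction at step {t}, near {where}")
```

Across several runs the reaction happens at a different time and place each time, which is exactly the behavior that purely deterministic models of cell function cannot capture.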

To understand the importance of the stochastic component of cell function, consider another example: monozygotic twins have exactly the same DNA, yet they look similar but not identical. If the cell-response system were purely deterministic, babies with the same DNA would look identical. The small differences in the twins’ physical appearance occur because of the stochastic nature of protein motion and interaction during morphogenesis. Even though the initial information was the same, and even though the outcome of protein reactions follows deterministic rules, the exact location of cells and proteins can only be described in a stochastic mode.

The stochastic part of cellular functions could, within a different framework, be seen as a model of self-organization. For people outside the biological research community, the introduction of randomness into this research is particularly intriguing. Instead of a process of preformation (those aspects of cell function that can be described by deterministic models), in the self-organizational model cell function results in something that cannot be described by deterministic rules. Cell functions depend on the fact that each cell is part of a whole. Together with their neighboring cells, cells react to external stimuli and to a large extent define the characteristics of the whole, as part of a bottom-up process. Deutsch and Dormann focus on the absence of a distinction between organizer and organized in self-organized systems: “In self-organized systems there is no dichotomy between the organizer and the organized. Such systems can be characterized by antagonistic competition between interaction and instability”[13]. To a certain extent, cell functions acquire a self-organizational character, because the transformations depend on the interaction of the cells with each other.

There are many examples like the above in molecular biology, and together they make the point that both deterministic and stochastic processes are used to describe the phenomena of life. As a general observation, many of the microscopic phenomena of life follow deterministic rules, but the macroscopic outcomes can be described only in a stochastic way. Following this thought, we argue that models of preformation and self-organization, as described above, can exist simultaneously in a system. The case of cell-cell interaction in general, and of morphogenesis in particular, illustrates the complex processes that occur, and highlights which parts of those processes suggest a deterministic, preformed model and which follow a stochastic model of self-organization.

03. Design

The two cases we have already examined (Xenakis’ work in musical composition and the study of morphogenesis in molecular biology) are both dependent to a great extent on the same medium: computation. Xenakis used the computer as the means to transform mathematical models into music almost from the beginning of his career. At the same time, it would be impossible for researchers today to study the extremely complex phenomena involved in the development of life without the use of the computer.

The use of the computer, of course, has also become one of the main driving forces behind design today. The encounter of computation with design happened rather late, and at first it took the form of an exploration of the formal possibilities that software packages offered. That initial, perhaps “immature” but still experimental, approach soon gave way to a widely generalized dominance of digital means over every aspect of architecture: from design strategies to the construction industry.

However, using the computer in an architectural context does not necessarily mean that we are taking advantage of the opportunities and the power that computation has to offer. More often than not, the use of computers in architecture today serves the “computerization” of already predefined processes and practices, usually in order to render them more efficient or less time-consuming. That might be convenient, but it does not promote the invention of new ways of thinking about architecture. As Kostas Terzidis notes, “computerization is the act of entering, processing or storing information in a computer… [while] … computation is about the exploration of indeterminate, vague, unclear and often ill-defined processes”[14]. While automating and mechanizing everyday architectural tasks may be useful, the true gain for architecture in relation to digital media lies in understanding what a computational design process can really be. Only in this way can we use computers to explore the “unknown”: to invent new architectures. We believe that the examples from biology and from the music of Xenakis can help us understand where these creative possibilities of computation lie. Computation, of course, has numerous applications in many different fields; the selection of these two specific cases as guidelines, however, has a very specific motivation. The biological approach provides scientific, highly developed techniques that have been tested thoroughly, and at the same time it can show us how computation and digital tools can become the bridge between complex processes taking place in the physical world and the way space is created. Xenakis’ work, on the other hand, is an example of computational techniques used outside their strict scientific origins, namely to compose a musical score; it can therefore provide insights into how those methods can be used to create an art form.

The work of Xenakis points to one of the most important things that computation brings into (architectural or musical) composition: the introduction of randomness. One can argue, of course, that architects have always had to take decisions based on chance. Humans, however, are not really capable of creating something totally random. If we ask somebody to draw a random line, several layers intervene between the decision to draw the line and the action of drawing it: one’s idea of what a line is, one’s idea of what random is, the interpretation of the phrase “draw a random line”, and so on. Computers, on the contrary, are very good at producing randomness (see figure 02: a stochastic algorithm positioning and scaling a box randomly; a sketch of such an algorithm follows below). If we program a computer to draw a random line, the computer will simply draw the line, with nothing external interfering between the command and the action. This ability to produce randomness, combined with the ability to perform complex calculations, defines the power of computational stochastic processes. And as we have already seen in the work of Xenakis, randomness can be controlled. The architect/programmer can specify rules, or define the range within which the stochastic process will take place, and the computer will then execute the process and produce results that satisfy the initial conditions. The advantage of such a process lies in the fact that, through randomness, architects can detach themselves from any preconceptions they may have about what the result should be, and it therefore becomes easier to generate solutions that were initially unpredicted. By defining the rules and letting the computer generate the results, we open ourselves to a field of almost endless possibilities; designers can produce results that they could not even have imagined at the beginning of the process, while still maintaining control of the process and of the criteria that must be met.

Figure 02: Stochastic distribution. Object-e architecture: space_sound. 2007.
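
As a minimal sketch of what “controlled randomness” can mean in practice (the site dimensions, scale range and spacing rule here are arbitrary illustrative choices, not the actual algorithm behind figure 02), the designer fixes the rules and the computer searches the random possibilities for configurations that satisfy them:

```python
import random

SITE = (100.0, 100.0)      # plan dimensions of a hypothetical site
SCALE_RANGE = (2.0, 10.0)  # allowed box sizes, fixed by the designer
MIN_GAP = 1.0              # minimum clearance between boxes

def random_box(rng):
    """A box positioned and scaled at random, but only within the set rules."""
    w = rng.uniform(*SCALE_RANGE)
    d = rng.uniform(*SCALE_RANGE)
    x = rng.uniform(0, SITE[0] - w)
    y = rng.uniform(0, SITE[1] - d)
    return (x, y, w, d)

def overlaps(a, b, gap=MIN_GAP):
    """True if two boxes come closer than the required clearance."""
    ax, ay, aw, ad = a
    bx, by, bw, bd = b
    return not (ax + aw + gap <= bx or bx + bw + gap <= ax or
                ay + ad + gap <= by or by + bd + gap <= ay)

def generate_layout(n=20, seed=None, tries=10000):
    """Draw random boxes, accepting only those that respect the rules."""
    rng = random.Random(seed)
    boxes = []
    for _ in range(tries):
        if len(boxes) == n:
            break
        candidate = random_box(rng)
        if all(not overlaps(candidate, b) for b in boxes):
            boxes.append(candidate)
    return boxes

layout = generate_layout()
print(f"{len(layout)} boxes placed; each run yields a different valid layout")
```

Every run produces a different layout, yet every layout respects the designer’s constraints; the randomness is internal to the process, exactly as in Xenakis’ scores.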


The most important thing to realize, and Xenakis’ work makes it easier to see, is that the power of algorithms used in composition lies in the process and not in the final result. If it is a building that we need to produce, then a building may well be produced in the end, just as a musical composition was produced in the end by Xenakis. But the emphasis moves from the final result to the process used to generate it; the architect no longer designs the building, but the process that generates it. To point it out once again: computation defines a process, not an output. And precisely because it allows us to focus on the process rather than on the results, those results can be unexpected.

Figure 03: student work by Josie Kressner.


While Xenakis’ example emphasizes the importance of process over final output, along with the stochastic properties of algorithms, the example from biology, when applied to architecture, highlights another important aspect of the use of algorithms: self-organization. Architectural tradition, starting with the Renaissance, is heavily based on the idea of the “master”: an architect with a specific vision, which he or she materializes through designs, subsequently creating a style. Self-organization, however, implies a totally different idea: the architect does not actualize through design something that he or she has already conceived. On the contrary, the architect creates the rules, specifies the parameters and runs the algorithm; the output is defined indirectly. Through computation, and after many iterations, even the simplest rules can produce extremely complex results, which are usually unpredictable. Moreover, a simple change in the rules may give rise to something totally different. The top-down idea of architecture, with the architect at the top level and his or her creations at the bottom, is inverted: the process begins from the bottom. Simple elements interact with each other locally, and through the iterative application of simple rules, complex patterns start to emerge. The architect no longer designs the object, but the system that will generate the final output.

An example of a pattern formation model with self-organizational properties is the cellular automaton, which is extensively used in several different fields, and lately also in architecture. A cellular automaton is a self-organized system in which complex formations arise as a result of the interactions and relations between individual elements. The simplicity of the model, combined with its ability to produce very complex results and to simulate a very wide range of different phenomena, makes it a very powerful tool, one that allows the architect to disengage from the creation of a specific output and to focus instead on the definition of a process (see figure 04: a one-dimensional cellular automaton and a surface generated by the iteration of its rules; a sketch of such an automaton follows below).

Figure 04: student work by Lauren Matrka.
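
A one-dimensional cellular automaton takes only a few lines to write. The sketch below uses Wolfram’s elementary rule 30 as an arbitrary example (figure 04 may well be based on a different rule); stacking the successive generations produces a two-dimensional pattern of the kind that can then be translated into a surface:

```python
def step(cells, rule=30):
    """One generation of an elementary cellular automaton: each cell looks
    only at itself and its two neighbours (wrapping at the edges), and the
    rule number encodes the output for all eight possible neighbourhoods."""
    n = len(cells)
    return [
        (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def evolve(width=79, generations=30, rule=30):
    """Stack successive generations; the rows form a two-dimensional pattern."""
    cells = [0] * width
    cells[width // 2] = 1  # a single seed cell
    history = [cells]
    for _ in range(generations):
        cells = step(cells, rule)
        history.append(cells)
    return history

# Complex global structure emerges from a purely local rule:
for row in evolve():
    print("".join("#" if c else "." for c in row))
```

Changing the single rule number yields a completely different global pattern, which is precisely the inversion described above: the designer specifies the local rule, not the result.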


The possibilities arising for architecture are virtually infinite: from the creation of self-organized and self-sustained ecosystems to the study and planning of urban growth. In place of externally imposed “sustainable” rules, we can have internally defined rules that form the generative process. In place of external “planning” strategies applied to cities, we can study urban entities as organisms, as systems that grow following specific rules that define the interaction between their elements.

Yet, as noted in the example of protein interaction, self-organization is not encountered on its own. It always functions together with other, deterministic or preformed, systems, in the same way that Xenakis used indeterministic processes in relation to deterministic ones.

What stochastic processes and self-organization offer to architecture are the means to engage the unknown, the unexpected; the means to move away from preconceptions that define what architecture is or should be, towards processes that explore what architectures can be. As Marcos Novak writes, “by placing an unexpected modifier x next to an entity y that is presumed – but perhaps only presumed – to be known, a creative instability is produced, asking, ‘how y can be x?’”[15]. In that sense, by placing models of self-organization next to models of preformation, or stochastic processes next to deterministic processes, we are not only inventing new systems or new architectures; we are also discovering new qualities in the systems that we already know, or thought we knew.



[1] See Xenakis, I. Formalized Music: Thought and Mathematics in Composition. New York: Pendragon Press, 1992.

[2] See Varga, B.A. Conversations with Iannis Xenakis. London: Faber and Faber Limited, 1996, p. 76.

[3] In the “deterministic” category of Xenakis’ work fall compositions like Akrata, Nomos Alpha and Nomos Gamma, while examples of the “indeterministic” approach can be found in compositions like N’Shima (Brownian motion) and Analogique A and B (Markov chains).

[4] Brownian motion in mathematics (also called the Wiener process) is a continuous-time stochastic process. In physics it is used to describe the random movement of particles suspended in a liquid or gas.

[5] Figures 1-2: from the project space_sound, Object-e architecture, 2007. Figures 3-4: student work, School of Architecture, Washington University in St. Louis.

[6] Game theory is a branch of applied mathematics that studies behavior in strategic situations, where an individual’s success in making choices depends on the choices of others.

[7] For a detailed description of Duel see Xenakis, I. Formalized Music: Thought and Mathematics in Composition. New York: Pendragon Press, 1992, pp. 113–122.

[8] Xenakis, I. Letter to Witold Rowicki. See Matossian, N. Xenakis. New York: Taplinger Publishing Co., 1986, pp. 164–165.

[10] See Deutsch, A. & Dormann, S. Cellular Automaton Modeling of Biological Pattern Formation. Boston: Birkhäuser, 2005.

[11] See Sadava, D., et al. Life: The Science of Biology. New York: W.H. Freeman, 2006.

[12] See Lodish, H., et al. Molecular Cell Biology. New York: W.H. Freeman, 2007.

[13] See Deutsch, A. & Dormann, S. Cellular Automaton Modeling of Biological Pattern Formation. Boston: Birkhäuser, 2005.

[14] See Terzidis, K. Expressive Form: A Conceptual Approach to Computational Design. New York: Spon Press, 2003, p. 67.

[15] See Novak, M. “Speciation, Transvergence, Allogenesis: Notes on the Production of the Alien”, AD vol. 72, no. 3, Reflexive Architecture, Spiller, N. (ed.), London: Wiley-Academy, 2002, p. 65.

1 comment:

Anonymous said...

Very interesting post, conceptually accurate without useless technical fuss. If you wish to develop the part where indeterminism rules cells, you might want to read Jean-Jacques Kupiec, The Origin of Individuals, World Scientific Publishing. French biologist; if you read French, the original text (L'origine des individus) is better. An excellent book, highly accessible to non-biologists. Thx and keep up.