Tuesday, June 3, 2008

What comes first: the chicken or the egg? Pattern Formation Models in Biology, Music and Design.

by Katerina Tryfonidou & Dimitris Gourdoukis

The popular riddle that asks whether the egg comes before the chicken or vice versa implies a vicious circle in which all the elements are already known to us and one simply succeeds the other in a totally predictable way. In this article we will argue, using arguments from fields as diverse as experimental music and molecular biology, that development in architecture, with the help of computation, can escape such a repetitive motif. On the contrary, by employing stochastic processes and systems of self-organization, each new step can be a step into the unknown, where predictability gives way to unpredictability and controlled randomness.

01. Music

The Greek composer and architect Iannis Xenakis, in his book Formalized Music[1], divides his works, or rather the methods employed to produce his works, into two main categories: deterministic and indeterministic models. The two categories, deriving from mathematics, refer to the involvement or not of randomness in the compositional process. As Xenakis himself explains, “in determinism the same cause always has the same effect. There’s no deviation, no exception. The opposite of this is that the effect is always different, the chain never repeats itself. In this case we reach absolute chance – that is, indeterminism.”[2] In other words, a deterministic model does not include randomness and will therefore always produce the same output for a given starting condition. Differential equations, for example, tend to be deterministic. On the other hand, indeterministic or stochastic processes involve randomness, and will therefore produce a different output each time the process is repeated, given the same starting condition. Brownian motion and Markov chains are examples of such stochastic mathematical models. Xenakis’ compositional inventory includes processes from both categories[3].
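As a minimal illustration of the stochastic category, the sketch below generates a short pitch sequence with a first-order Markov chain; the pitches and transition probabilities are invented for the example and are not taken from Xenakis’ scores. Re-running it from the same starting note generally yields a different sequence, whereas a deterministic rule would always return the same one.

```python
import random

# Hypothetical transition table: probability of moving from one pitch to the next.
# Pitches and weights are illustrative only, not drawn from any actual composition.
TRANSITIONS = {
    "C": [("D", 0.5), ("E", 0.3), ("G", 0.2)],
    "D": [("C", 0.4), ("E", 0.4), ("G", 0.2)],
    "E": [("C", 0.3), ("D", 0.3), ("G", 0.4)],
    "G": [("C", 0.6), ("E", 0.4)],
}

def next_pitch(current):
    """Choose the next pitch according to the transition probabilities."""
    pitches, weights = zip(*TRANSITIONS[current])
    return random.choices(pitches, weights=weights)[0]

def markov_melody(start="C", length=16):
    """Generate a pitch sequence; repeated runs give different results."""
    melody = [start]
    for _ in range(length - 1):
        melody.append(next_pitch(melody[-1]))
    return melody

print(markov_melody())  # a (usually) different sequence on every run
```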

As said above, the use of stochastic models in composition results in a process that produces a different outcome each time it is repeated. For example, using Brownian motion[4] (see figure 01: particles generated using Brownian motion[5]) in order to create the glissandi of the strings means that the glissandi are generated through a process that includes randomness; if we try to generate them again we will get a different output. At the same time, all the different results of the process will share some common characteristics. With that in mind, one would expect that such a musical composition would vary, at least in some aspects, each time it is performed. However, that is not the case with Xenakis’ works. While he employed stochastic processes for the generation of several parts of his scores, he always “translated” his compositions into conventional musical notation, with such detail that he left no space at all for the performer to improvise or to approach the composition in a different way. In other words, the generation of the score involves randomness to a great extent, but the score is finalized by the composer so that each time it is performed it remains the same.

Figure 01: Brownian motion. Object-e architecture: space_sound. 2007.
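A minimal sketch of the idea behind figure 01, using a discrete random walk as a stand-in for Brownian motion (the start pitch, step size and length are arbitrary assumptions for the example): each run produces glissando-like trajectories that share the same statistical character but differ in their details.

```python
import random

def brownian_glissando(start_pitch=60.0, steps=100, sigma=0.5):
    """Random-walk approximation of Brownian motion: each step adds a small
    Gaussian increment to the current pitch (measured here in semitones)."""
    path = [start_pitch]
    for _ in range(steps):
        path.append(path[-1] + random.gauss(0.0, sigma))
    return path

# Several glissandi generated by the same process: common character, different detail.
for i, g in enumerate(brownian_glissando() for _ in range(4)):
    print(f"glissando {i}: starts at {g[0]:.1f}, ends at {g[-1]:.1f}")
```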


What is perhaps even more interesting is that Xenakis did compose scores that are different each time they are performed. However, those scores usually employ deterministic mathematical models, that is, models that do not include randomness. In those cases the situation is inverted: the generation of the score is deterministic, but the performance may vary.

An example of the latter case is Duel, a composition that is based on game theory[6]. The composition is performed by two orchestras guided by two conductors, and is literally a game between the two that in the end yields a winner. For each move, each conductor has to select one out of seven options that are predefined by the composer. A specific scoring system is established, and the score of each orchestra depends on the choices of the two conductors[7]. The result of this process is that each time the composition is performed, the outcome is different. Therefore, a deterministic system with seven specific and predefined elements produces a result that varies in each performance of the score. To make things even more complicated, the seven predefined musical elements were composed by Xenakis with the use of stochastic processes. To summarize the structure of Duel: Xenakis generated seven different pieces using stochastic processes, therefore seven pieces that include randomness. Those pieces, however, were finalized by the composer into a specific form. They are then given to the conductors, who are free to choose one for each move of the performance. The choice of each conductor, however, is not random: “… it is [not] a case of improvised music, ‘aleatory’, to which I am absolutely opposed, for it represents among other things the total surrender of the composer. The Musical Game accords a certain liberty of choice to the two conductors but not to the instrumentalists; but this liberty is guided by the constraints of the Rules of the Game, and which must permit the music notated by the score to open out in almost unlimited multiplication.”[8] So the choices of each conductor are based upon the strategy that he follows in order to win the game, and consequently upon the choices of the other conductor. Therefore the final performance of the score is different each time.
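The structure described above can be reduced to a toy two-player game: each conductor repeatedly picks one of seven pre-composed elements, and a scoring matrix rewards each choice in relation to the opponent’s choice. The payoff values and the naive strategy below are invented for the sketch and are not Xenakis’ actual rules; the point is only to show how fixed rules wrapped around pre-composed material can still yield a different performance every time.

```python
import random

N_OPTIONS = 7  # seven pre-composed musical elements, labelled 0..6

# Hypothetical payoff matrix: PAYOFF[a][b] is conductor A's gain when A plays
# element a against B's element b. The values are invented for this sketch.
random.seed(7)  # keep the invented "rules of the game" fixed between runs
PAYOFF = [[random.randint(-2, 2) for _ in range(N_OPTIONS)] for _ in range(N_OPTIONS)]
random.seed()   # but leave the conductors' choices free to vary

def respond_to(opponent_last):
    """A naive stand-in for a conductor's liberty of choice: pick at random
    among the three elements scoring best against the opponent's last move."""
    ranked = sorted(range(N_OPTIONS), key=lambda a: PAYOFF[a][opponent_last], reverse=True)
    return random.choice(ranked[:3])

score_a = score_b = 0
last_a = last_b = 0
for move in range(10):                          # ten "moves" of the performance
    a, b = respond_to(last_b), respond_to(last_a)
    score_a += PAYOFF[a][b]
    score_b += PAYOFF[b][a]
    last_a, last_b = a, b

print("winner:", "A" if score_a > score_b else "B" if score_b > score_a else "draw")
```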

Xenakis was quite specific in his decisions regarding the use of deterministic or indeterministic processes. In most cases he employs models from both categories in each composition. More importantly, while he calls his music “stochastic music”, the stochastic part, the place where randomness occurs, is always internal to the process and totally controlled by the composer. The final result, the score that reaches the performer, or even more the listener, is always specific. Even in the cases where the outcome may vary, it does so within a set of predefined solutions already anticipated by the composer.

02. Life Science

The study of morphogenesis, one of the most complex and amazing research topics of modern biology, aims to understand the processes that control the organized spatial distribution of cells during the embryonic development of an organism; in other words, how, starting from a single fertilized egg, the overall structure of the body is formed through cell division and specialization[9]. According to Deutsch and Dormann[10] there is a continuum of different approaches to the problem; at one end there are theories of preformation, and at the other systems of self-organization. The concept of preformation assumes that any form is preformed and static. Therefore any new form is always the result of a combination of already existing forms. Taking a different approach, self-organization implies a de novo pattern formation that is dynamic and develops over time. Morphogenesis in the self-organization model depends on the interaction between the initial cells or units. Preformation is a top-down idea, while self-organization is a bottom-up system. In both cases, research uses computation as the necessary medium for the simulation of the biological processes. We will argue that, according to the latest research in biology, morphogenesis can be approached as a process which involves both the notion of preformation and that of self-organization.

During morphogenesis, cells proliferate and specialize, i.e. they choose which subset of proteins to express. But how do cells know when to divide or where to specialize? How do cells know where to become skin or bone, or how many fingers they should form? It turns out that the key to understanding morphogenesis is the way cells sense and respond to their environment.

Cells obtain information about their environment by using proteins embedded in their membrane to sense specific “message” proteins that are located around them. When such a “message” protein binds to a membrane protein, the cell “receives” the message and acts accordingly[11]. Therefore, during morphogenesis there is a constant interaction, through such “message” proteins, between each cell and its neighboring cells. This interaction helps cells to understand where in the body they are located, when they should divide and when they need to specialize into a particular type of cell.

The above function, or better, sequence of functions, has been the focus of scientific research for the past two decades. Nowadays, molecular biology can accurately describe many steps of the reactions between proteins and how they are related to cell specialization[12]. It has been shown that these reactions follow specific physical laws which can be described by mathematical models. For example, given a pair of proteins, the properties of the resulting interactions are known. Because of the physical laws that apply, the model of the function of cells is a deterministic one, since it is made of many elementary interactions between proteins that have well defined inputs and outputs. Taking this notion a step further, one could argue that the function of the cells implies the idea of preformation, that is, from two predefined elements only one possible combination can occur. In a way, the deterministic rules that the reactions of the proteins follow can be seen as a model of preformation, where there is only one output for a given input.

Although the reactions between the proteins inside the cell follow specific, well defined rules, there is still a great degree of unpredictability in the life and function of each cell. Why can science not predict exactly what the next “moves” of the cells will be, and thus control all the functions in a (human) body? Although the nature of the outcome of the interaction between two proteins has been studied and analyzed, it is not possible to define deterministically when and where this interaction will take place, since proteins move randomly in space and can interact only if they come into proximity and under the proper relative orientation. This is true for proteins both inside and outside the cell. Furthermore, it is not possible to define the exact location of neighboring interacting cells, when each cell will sense the presence of a “message” protein, when and how much the cell will respond to this signal by secreting its own “message” proteins, and when its neighbors will sense this signal. Given the number and complexity of the functions in each cell, as well as the vast possibilities of interactions with its neighboring cells, the large number of processes that could potentially happen cannot be expressed by deterministic models.

Since there is, to a certain degree, randomness in cellular functions, science turned to stochastic models in order to explain them. That is, instead of deterministic mathematical models, scientists use models that incorporate probabilities, in order to account for the large number of possible actions. Brownian motion, for example, is the stochastic model that describes the movement of particles in fluids, and is therefore used to describe the movement of proteins inside the cell. Stochastic processes can describe the spatial and temporal distribution of interactions inside cells and between neighboring cells.
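A minimal stochastic sketch of this point, under simplifying assumptions of our own rather than any biological model: two “proteins” perform a random walk on a plane, and an “interaction” is registered only when they happen to come within a given distance of each other, so the time and place of the reaction differ on every run even though the interaction rule itself is fixed.

```python
import math
import random

def random_step(pos, step=1.0):
    """Move a particle by a small step in a random direction
    (a 2D random walk, standing in for Brownian motion)."""
    angle = random.uniform(0.0, 2.0 * math.pi)
    return (pos[0] + step * math.cos(angle), pos[1] + step * math.sin(angle))

def first_encounter(radius=1.5, max_steps=100_000):
    """Walk two particles until they come within `radius` of each other.
    The step count and meeting point differ on every run; returns
    (None, None) if they never meet within max_steps."""
    a, b = (0.0, 0.0), (20.0, 0.0)
    for t in range(max_steps):
        a, b = random_step(a), random_step(b)
        if math.dist(a, b) < radius:
            return t, a
    return None, None

print(first_encounter())  # a different answer (almost) every time
```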

To understand the importance of the stochastic component of cell function, consider another example: monozygotic twins have exactly the same DNA, yet they look similar but not identical. If the cell-response system were purely deterministic, then babies with the same DNA should look identical. The small differences in their physical appearance occur because of the stochastic nature of protein motion and interaction during morphogenesis. Even though the initial information is the same, and even though the outcome of protein reactions follows deterministic rules, the exact location of cells and proteins can only be described in a stochastic way.

The stochastic part of cellular functions could, in a different framework, be seen as a model of self-organization. For people outside the biological research community, the introduction of randomness into research seems particularly intriguing. Instead of a process of preformation (the specific aspects of cell function that can be described by deterministic models), in the self-organizational model cell function results in something different, something that cannot be described by deterministic rules. Cell functions depend on the fact that each cell is part of a whole. Together with their neighboring cells, they react to external stimuli and to a large extent define the characteristics of the whole, as part of a bottom-up process. Deutsch and Dormann focus on the absence of distinction between organizer and organized in self-organized systems: “In self-organized systems there is no dichotomy between the organizer and the organized. Such systems can be characterized by antagonistic competition between interaction and instability.”[13] To a certain extent, cell functions acquire a self-organizational character, because the transformations depend on the interaction of the cells with each other.

There are many examples like the above in molecular biology that make the point that both deterministic and stochastic processes are used to describe the phenomena of life. As a general observation, many of the microscopic phenomena in life science follow deterministic rules, but the macroscopic outcomes can be described only in a stochastic way. Following this thought, we argue that models of preformation and self-organization, as described above, can exist simultaneously in a system. The case of cell-cell interaction in general, and of morphogenesis in particular, depicts the complex processes that occur and highlights which parts of those processes suggest a deterministic, preformed model, and which parts follow a stochastic model of self-organization.

03. Design

The two cases we have already examined – Xenakis’ work in musical composition and the study of morphogenesis in molecular biology – are both dependent to a great extent on the same medium: computation. Xenakis used the computer as the means to transform mathematical models into music almost from the beginning of his career. At the same time, it would be impossible for researchers today to study the extremely complex phenomena involved in the development of life without the use of the computer.

The use of the computer, of course, has also become one of the main driving forces behind design today. The encounter of computation with design happened rather late and at first took the form of an exploration of the formal possibilities that software packages were offering. That initial – maybe “immature” but still experimental – approach soon gave way to a widely generalized dominance of digital means over every aspect related to architecture: from design strategies to the construction industry.

However, using the computer in an architectural context does not necessarily mean that we are taking advantage of the opportunities and the power that computation has to offer. More often than not, the use of computers in architecture today serves the purpose of the “computerization” of already predefined processes and practices – usually aiming to render them more efficient or less time consuming. That might be convenient, but it does not promote the invention of new ways to think about architecture. As Kostas Terzidis notes, “computerization is the act of entering, processing or storing information in a computer… [while] … computation is about the exploration of indeterminate, vague, unclear and often ill-defined processes.”[14] While automating and mechanizing everyday architectural tasks may be useful, the true gain for architecture in relation to digital media lies in understanding what a computational design process can really be. Only in this way can we use computers to explore the ‘unknown’; to invent new architectures. We believe that the examples already mentioned from biology and from the music of Xenakis can offer the means to better understand where these creative possibilities of computation lie. Of course computation has numerous applications in several different fields; the selection of these two specific cases as guidelines, however, has a very specific motivation. The biological approach provides scientific, highly developed techniques that have been tested thoroughly, and at the same time can show us how computation and digital tools can become the bridge between complex processes taking place in the physical world and the way that space is created. Xenakis’ work, on the other hand, is an example of computational techniques used outside their strict scientific origins, that is, in order to compose a musical score. It can therefore provide insights into how those methods can be used in order to create an art form.

The work of Xenakis points out one of the most important aspects that computation brings into (architectural or musical) composition: the introduction of randomness. One can argue, of course, that architects always had to take decisions based on chance. However, humans are not really capable of creating something totally random. For example, if we ask somebody to draw a random line, then between the decision to draw a line and the action of drawing it there are several layers that affect the result: one’s idea of what a line is, one’s idea of what random is, the interpretation of the phrase “draw a random line”, and so on. Computers, on the contrary, are very good at producing randomness (see figure 02: a stochastic algorithm positioning and scaling a box randomly). If we program a computer to draw a random line, the computer will simply draw a line without anything external interfering between the command and the action. The ability to produce randomness, combined with the ability to perform complex calculations, defines the power of computational stochastic processes. And, as we have already seen with the work of Xenakis, randomness can be controlled. The architect/programmer can therefore specify rules, or define the range within which the stochastic process will take place, and the computer will then execute the process and produce results that satisfy the initial conditions. The advantage of such a process lies in the fact that through randomness the architect can be detached from any preconceptions that he or she may have about what the result should be, making it easier to generate solutions that were initially unpredicted. By defining the rules and letting the computer generate the results, we open ourselves to a field of almost endless possibilities; designers can produce results that they could not even imagine at the beginning of the process, while still maintaining control of the process and the criteria that should be met.

Figure 02: Stochastic distribution. Object-e architecture: space_sound. 2007.
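A minimal sketch of this kind of controlled randomness (the axes, ranges and counts below are arbitrary assumptions for the example, not the rules behind figure 02): the designer fixes the bounds within which positions and scales may fall, the computer draws as many variants as needed, and every run of the script yields a new, equally rule-abiding population of boxes.

```python
import random

# Designer-defined constraints: the stochastic process may act only inside these ranges.
BOUNDS = {"x": (0.0, 50.0), "y": (0.0, 30.0), "z": (0.0, 10.0)}
SCALE_RANGE = (0.5, 4.0)

def random_box():
    """Position and scale a box randomly, but only within the designer's bounds."""
    position = {axis: random.uniform(lo, hi) for axis, (lo, hi) in BOUNDS.items()}
    scale = random.uniform(*SCALE_RANGE)
    return {"position": position, "scale": scale}

# A population of boxes: every run gives a different, yet rule-abiding, configuration.
boxes = [random_box() for _ in range(20)]
print(boxes[0])
```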


The most important aspect to realize, and Xenakis’ work makes it easier to see, is that the power of algorithms used in composition lies in the process and not in the final result. If it is a building that we need to produce, then a building might be produced in the end, just as a musical composition was produced in the end by Xenakis. The emphasis, however, moves from the final result to the process that we use to generate it; the architect is no longer designing the building, but the process that generates it. To put it once again: computation defines a process, not an output. And precisely because it allows us to focus on the process and not on the results, those results can be unexpected.

Figure 03: student: Josie Kressner


While Xenakis emphasizes the importance of the process over the final output, along with the stochastic properties of algorithms, the example from biology, when applied to architecture, highlights another important aspect that the use of algorithms raises: that of self-organization. Architectural tradition, starting with the Renaissance, is heavily based upon the idea of the “master”: an architect with a specific vision, which he or she materializes through his or her designs, subsequently creating a style. Self-organization, however, implies a totally different idea: the architect does not actualize through his design something that he or she has already conceived. On the contrary, the architect creates the rules, specifies the parameters and runs the algorithm; the output is defined indirectly. Through computation and after many iterations, even the simplest rules can provide extremely complex results, which are usually unpredictable. Moreover, a simple change in the rules may give rise to something totally different. The top-down idea of architecture, with the architect at the top level and his or her creations at the bottom, is inverted: the process begins from the bottom. Simple elements interact with each other locally, and through the iterative application of simple rules, complex patterns start to emerge. The architect no longer designs the object, but designs the system that will generate the final output.

An example of a pattern formation model with self-organizational properties is that of cellular automata, which are extensively used in several different fields and, lately, also in architecture. A cellular automaton is a self-organized system where complex formations arise as a result of the interaction and the relations between the individual elements. The simplicity of the model, combined with its ability to produce very complex results and to simulate a very wide range of different phenomena, makes it a very powerful tool that allows the architect to be disengaged from the creation of a specific output and to focus instead on the definition of a process (see figure 04: a one-dimensional cellular automaton and a surface generated by the application of its rules).

Figure 04: student: Lauren Matrka
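As a generic illustration of the model (an elementary cellular automaton such as Wolfram’s rule 30, not the specific student project shown in figure 04): each cell looks only at itself and its two neighbours, and a single local rule, applied iteratively, produces a complex global pattern; changing the rule number changes the whole pattern.

```python
def step(cells, rule=30):
    """Apply an elementary (one-dimensional, two-state) CA rule once.
    Each new cell depends only on its neighbourhood of three cells."""
    n = len(cells)
    new = []
    for i in range(n):
        left, centre, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (centre << 1) | right   # neighbourhood encoded as 0..7
        new.append((rule >> index) & 1)               # look the outcome up in the rule number
    return new

# Start from a single live cell and iterate the same local rule.
cells = [0] * 31
cells[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells, rule=30)
```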


The possibilities arising for architecture are virtually infinite: from the creation of self-organized and self-sustained ecosystems to the study and planning of urban growth. In the place of externally imposed “sustainable” rules, we can have internally defined rules that form the generative process. In the place of applying external “planning” strategies to cities, we can study urban entities as organisms, as systems that grow following specific rules that define the interaction between their elements.

Yet, as noted in the example of protein interaction, self-organization is not encountered on its own. It always functions together with other, deterministic or preformed, systems, just as Xenakis used indeterministic processes in relation to deterministic ones.

What stochastic processes and self-organization offer to architecture are the means to engage the unknown, the unexpected; the means to move away from any preconceptions that define what architecture is or should be, towards processes that explore what architectures can be. As Marcos Novak writes, “by placing an unexpected modifier x next to an entity y that is presumed – but perhaps only presumed – to be known, a creative instability is produced, asking, ‘how y can be x?’”[15] In that sense, by placing models of self-organization next to models of preformation, or stochastic processes next to deterministic processes, we are not only inventing new systems or new architectures, but we are also discovering new qualities of the systems that we already know – or thought that we knew.



[1] see Xenakis, I. Formalized Music: Thought and Mathematics in Composition. New York: Pendragon Press, 1992.

[2] see Varga, B.A. Conversations with Iannis Xenakis. London: Faber and Faber Limited, 1996, p. 76.

[3] In the “deterministic” category of Xenakis’ work fall compositions like Akrata, Nomos Alpha and Nomos Gamma, while examples of the “indeterministic” approach can be found in compositions like N’Shima (Brownian motion) and Analogique (Markov chains).

[4] Brownian motion in mathematics (also Wiener process) is a continuous-time stochastic process. In physics it is used in order to describe the random movement of particles suspended in a liquid or gas.

[5] Figures 1-2: from the project space_sound, Object-e architecture, 2007. Figures 3-4: student work, School of Architecture, Washington University in St. Louis.

[6] Game theory is a branch of applied mathematics that studies behavior in strategic situations, where an individual's success in making choices depends on the choices of others.

[7] For a detailed description of Duel see Xenakis, I. Formalized Music: Thought and Mathematics in Composition. New York: Pendragon Press, 1992, pp. 113-122.

[8] Xenakis, I. Letter to Witold Rowicki. see Matossian, N. Xenakis. New York: Taplinger Publishing Co., 1986, pp. 164-165.

[10] see Deutsch, A. & Dormann, S. Cellular Automaton Modeling of Biological Pattern Formation. Boston: Birkhäuser, 2005.

[11] see Sadava, D., et al. Life: The Science of Biology. Freeman Company Publishers, 2006.

[12] see Lodish, H., et al. Molecular Cell Biology. Freeman Company Publishers, 2007.

[13] see Deutsch, A. & Dormann, S. Cellular Automaton Modeling of Biological Pattern Formation. Boston: Birkhäuser, 2005.

[14] see Terzidis, K. Expressive Form: A Conceptual Approach to Computational Design. New York: Spon Press, 2003, p. 67.

[15] see Novak, M. “Speciation, Transvergence, Allogenesis: Notes on the Production of the Alien”, AD vol. 72, no. 3, Reflexive Architecture, Spiller, N. (ed.), London: Wiley Academy, 2002, p. 65.


Monday, June 2, 2008

Architecture, science and the social: A conversation with Antoine Picon.

by Dimitris Gourdoukis

Antoine Picon is a historian and theoretician whose writings focus mainly on the relation between architecture and science, or architecture and technology; a relation that, in the current situation of architecture, is extremely important, if we think about the impact of the use of computers in architectural design and the ‘uncertainty’ that new technologies brought into architectural practice. The author of several books and numerous articles, and at the same time Professor of the History of Architecture and Technology and Director of Doctoral Programs at the GSD, Antoine Picon offers a better understanding of that relation, through its history, but also through a critical approach to its current condition.

The discussion presented here on the t-machine took place in the spring of 2006 at Washington University in St. Louis. The questions were prepared together with Matthew Toth.

DG: You write often about the relation of architecture to science. It is also obvious that in recent years this relation has been strengthened. Why do you think this is happening?

AP: I think there are various reasons at various levels. The first one I think is related to the very specific context in architecture. I would say that we are in the middle of a state of incertitude regarding guidelines for architectural practice and reflection today. We are no longer entirely modern, we have abandoned post-modernism, and even the kind of Koolhaas global architecture is wearing thin. So there is a need to try to find one guiding principle. That would be a first reason. Probably a second reason is that science seems to be unfolding today a pretty fascinating world, which is pretty much in tune with what people perceive; for example, a world that is both computable and totally unpredictable, which is something that quite strikes me. If you take the financial markets, for example, they are totally computerized and totally unpredictable. In a way, science is closer to some of the fundamental intuitions that we have of our world. So this is another reason. And then a third reason is of course the computer, and the advance of digital culture and the possibility of circulation of models between the two domains. So I think that it is the convergence of all that that might explain this renewed interest in science today.

DG: You often argue that architecture was always virtual in the sense of design. How is that virtuality of architecture related to what we encounter as virtual in architecture today, in the computer? Is it the same notion, or are we referring to two different notions that just happen to be described with the same word?

AP: Well, historians will tell you that it is an expression of the same thing. I think virtuality is a fundamental component of architecture. This is what I tried to explain in the text you are referring to. Architecture is always as much a promise of unfolding as something that is already there; it is always something that is in between the ‘there’ and the ‘not yet there.’ I think this is something that has not changed. What has changed is the way it manifests itself. In the article that you mentioned (Architecture and Science) I mention how ornament was part of the former virtuality of architecture, which today is no longer the case. To go back to the problem of computation and form, the virtual has more to do with the relation between computation and poetics, in some ways, which is something relatively new.

DG: But in a way is it the same idea, and the computer is only the medium that helps us experience it?

AP: No, I strongly believe that the computer alters the way, it alters even what we call materiality. The computer is as much a machine that redefines the experience we have of the world as it is a pure computing device. So the experiential level, our relation to the physical world, is changing because of the computer. An example I always like to give is zoomability: we used to live in a world in which images were static; now we are more and more accustomed to images that are clickable, and in a way an image that is not clickable is strange. So clickability, zoomability, has become a natural condition. It is this experiential dimension for me that is really altering. I mentioned images, but there are others too. Our relation to light is changing, because now light is something we can manipulate in a much more detailed way, we can parameterize it. So a lot of things are changing, and thus the way we perceive architecture is changing profoundly, as well as the kind of promise architecture is about.

MT: Does that affect the way we produce architecture too? Or the way we perceive the images?

AP: Absolutely, although we do not know yet to what extent the way we produce architecture is going to change. I believe it is going to change enormously, but we have not yet fully realized that. For example, since everything in a computer process is theoretically reversible, you have to make choices, and the computer forces you to be more strategic. That is why the strategy of the path you follow is becoming more and more important. The computer is a machine that can produce so many different scenarios that the selection is more crucial than before.

MT: It seems to me that the reverse would be true, that before computers you needed a strategy because you could only produce one variation, but now you can have 15 different variations, variations in the colors of the façade just by some quick clicks.

AP: Yes, but you can argue that it is a little bit like consumer culture, before, when you had only one product to buy, you just bought the product. If you have 50 available, you will have to ask yourself “why am I going to buy this?” The diversity of the computer forces you, in a way, to have a clear vision of what you want, in order not to get lost in the indefinite diversity. To give another example that always strikes me, think of the writing of a text: before, typing a text, using the typewriter, was such an ordeal that when you were done with it, you usually kept the text as it was, except if really there was something totally wrong. Today you can always redo or modify, so you have to make a decision on when this is over, on when the text is finished. It might look more arbitrary, but it is really what I call a more strategic decision. And for that you have also to set your goals more clearly, because the production of the text becomes so flexible, you can cut, you can paste, so you have to define what the aim you are pursuing is.

DG: There seem to be certain thinkers connected with these new conditions in architecture, like Deleuze and Bergson. How do you think that those things come together? Do you believe that they can provide a theoretical background for digital architecture?

AP: Hmmm… I’ll be very honest with you. I like philosophy, but I think philosophy is always a provisory dress for architecture, and that architecture has got to find its own self-motivation. Of course there are intuitions in Deleuze, from Mille Plateaux to Le Pli, or later, that are in tune with some of what architects are pursuing. However, I am not sure that this is going to provide The Theory, because I think architecture needs to provide its own understanding of things.

DG: So it just provides inspiration?

AP: Yes, it provides a provisory way to theorize things that we have not yet totally come to terms with. That’s at least my take on the question. Of course there are a few things, like what Deleuze writes on ornament, etc., that are in correspondence, which is natural, as architecture is a cultural production and is in tune with all other cultural production, but I do not think that architecture has ever found in Immanuel Kant or in Hegel its ultimate justification. It can be inspired by some of the concepts of philosophy, but then it has to arrive at its own understanding of things.

DG: Then, do we need an architectural theory today?

AP: It depends on what you call theory. If you call theory a closed system of principles, then no, I do not think we need that. If we speak of guidelines, of a way to define objectives, then yes. To give another example, which is also one of my obsessions, I am pretty convinced that we also have to address social issues today. They’re back, so to speak. And blobs are not enough. The real question is how to articulate new ways to do architecture with new expectations from people and so forth. It cannot be cynical any more, in the manner of the kind of discourse on global architecture, or fashion design, that you find from Koolhaas to van Berkel; even Koolhaas is changing these days. So I do not think we need a theory, but we need to reassess the way architecture relates to the social demand. We need to see how to relate to the social demand without falling back into the utopian discourse of modernity. So, I think architecture is in need of theoretical questions, probably more than theoretical answers.

MT: Is this architectural theory something that’s built or something that is written?

AP: I think it’s always in between the two. Architecture is something that goes back and forth between material production, very material production and lighting, images, and feeds on both. I think it feeds both on very material realization and imagination. Ahhh… How could I state it? Architecture must be inspiring. And you are not inspiring if you are only a beautiful object, you are inspiring if you suggest a different world, if you make proposals for a different world, a different way to perceive things. So there is always the need of the two: architecture is always both in the building and in the commentary of the building, if you like.

MT: So, is it possible to build a theory?

AP: Again, it depends on what you call a theory. I would say, rather than a theory as a corpus of principles, something that identifies what matters at a certain point. If you look at Le Corbusier’s buildings, there is a certain number of things that matter, and “Theory” is about exploring what matters for a certain type of architecture at a given moment.

DG: Let’s go back to the notion of materiality in architecture; we could say that today we have reached a point where we can design the materials we are going to use. So, in an extreme situation, it is possible for an architect to design something and then say “now let’s create the materials to build it,” which I suppose changes the way we understand materiality. But at the same time, using the computer, we find a new kind of materiality. When you have to work with a certain software package and use the geometries it employs (NURBS for example), you have to understand, in a way, the materiality of the software. So where do those things bring us, and how could we understand materiality in architecture?

AP: I think what you say is true; today we can design materials, and although that does not happen for the first time, today it is possible to an extent that it was not before. Another thing is that today materiality is both extremely abstract and extremely concrete. It is both in the software and in the codes of the software, as well as in the very things you are going to touch. So the new materiality probably has to do with those very conflicting categories. That said, I do not think that we are at a stage where we know where we are actually going.

DG: But you do recognize it as an important issue?

AP: I think actually it is one of the most important issues today. We are defined by the way we define what is not us, and materiality is very much about what is not us. Therefore there is this strange relation between redefining what is materiality and what is man. And we are in a period where the definition of man is changing.

DG: Could we also say that the computer gives a different value to the idea of materiality? An architect who has to work on the computer in a way becomes a craftsman in order to understand the properties of the things he is working with. I suppose we used to think of the architect as someone who does not go to that level of craftsmanship.

AP: I am not necessarily persuaded… I will mention Kostas Terzidis; he thinks that architecture goes to an algorithmic level… I am not convinced this is always necessary. I think you can do a lot of things without knowing their principles. What you need then is to have a clear idea of what you want to achieve, so that you are not trapped totally in what the machine or the tool wants. If you do not want to be a prisoner of the tools, there are various things you can do: one is to know perfectly well all the internal logic of the tool; another is to define your goal and choose the tool as a function of the goal, which is a different way. I believe architects are probably prone to the second approach. The new generation that will come will be incomparably savvier in computer coding. But that said, I do not think that they will always have the time to play with the code, etc., because in the design profession there are so many other things to do. So my guess is that we should return to the idea of the strategic, and that architecture is very much in need of defining its goals better today. The question is what you want to achieve with design, because it is clear you cannot achieve everything.

MT: That is why, as you say, the social issues are becoming the focus of the discourse again?

AP: Absolutely. Today we have reached a state where we have to redefine what architecture brings, or the question becomes too complex… Architecture was never a purely philanthropic activity; it is as much an art, but a bizarre social art, in which there is always the ambition to reshape sociability and society. So it’s a bizarre thing. It is more complex than planning. Planning is totally good citizenship… Architecture is more perverse. Today we have to reevaluate both all the internal goals of architecture and how they relate to other goals of a more social nature.

DG: There also seems to be an obsession with form, or with the image: the visualizations that the computer produces.

AP: Yes, I think it is normal in some ways, because we believe that beyond the images there is something; if it were just for the sake of the images we would say “ok, so what?” But I think beyond that, there are things at stake. A comparison I use from time to time is with the Renaissance, not necessarily because we are in a new Renaissance. When people were playing with perspective, they were playing with images, and they were totally fascinated with images. But images are important. They do reshape the world in which we live. So I would say that we do not know exactly how computer images are going to reshape the world in which we live, but we are pretty much there already. So a play with images may seem a little bit superficial, and indeed some are totally fetishizing the image, but I think there is something deeper.

DG: Is the image always a part of architecture?

AP: Architecture has to do with images, and there are two functions for images. The first one is to synthesize heterogeneity. This is why the architect reasons through images: the problem of design is that it has to take into account very heterogeneous factors and problems. This is where it is different from engineering; engineering usually deals with relatively homogeneous types of problems. Of course any technical system can be extremely complex, but it is more univocal than architecture, which has to synthesize very diverse things. And an image is something that unifies; you can put extremely different things in an image, and the fact that they are in an image unifies them. Look at a surrealist image: you have an apple next to a locomotive, but bizarrely, because it is transformed into an image, it makes some sense. So I think architecture uses images for that purpose. Also because images are part of the social imaginary, part of what people expect from the world. I think architecture is also a play on expectation, raising expectations of meaning, etc.

MT: Is the computer image the latest model of architectural style or trend? I am thinking of automotive design over the past century and the lifestyle implications, and the way a particular design suggests a certain society.

AP: Yes, and do not forget that the computer image is totally in continuity with videogames and that kind of thing, which are powerful images today. Even if you look at the way we tend to circulate in models today, a little bit like Super Mario… And it has an impact on architecture; you could very well argue that the Foreign Office Yokohama Terminal is a manifestation of the age of video games: sliding, going up and down, attracted by topological holes, attracted by that kind of thing. A French poet once said: “Nothing is more profound than skin.” Images are just like skin, both totally superficial and extremely profound.

MT: When people think of computer-designed buildings, they have a certain image associated with that, and in that sense people in that line of work are pretty successful. You mentioned blob architecture. Is this image of computer design important? Is it a lasting image, is it going to be the next legacy, or is there something else about computers that is more important? Is it the process of using computers that will have the real impact? Is it the multifaceted skin, portrayed in three dimensions and constructed, that is important, that we are taking away from computers, or is there something else?

AP: I think we are taking away a lot of other things. Frankly, I do not know; I am not a prophet. I am interested in digital architecture regardless, not necessarily as the ultimate answer we can bring to questions, but more for the questions it raises. For example, if you take the blobs, the blobs are what they are; they raise a couple of interesting issues, for example this issue of formal freedom, of what formal freedom is today. Another issue raised is aesthetic judgment: we do not know if a blob is beautiful or not. That is another interesting question. So they raise a couple of interesting questions. That said, I am not sure the blob is the only solution we can bring. I recently wrote a piece for a friend of mine, whose architecture is probably still in its infancy, but his idea is that we could also envisage a relation between the computer and the virtual that could rather lead us to a new minimalism. I would say you have the blob on the one hand, but minimalism is also an answer today. I do not think that the blobs are the only solution. I think they have raised interesting questions, but I do not think they are necessarily the only future we have in front of us. I happen to be pretty eclectic on that matter. There are some blobbish people I am interested in and others in whom I am not at all interested. Some projects of Jesse Reiser or Goulthorpe are interesting even in plastic terms; I am probably less convinced by some of Lynn’s creations, although the most recent, more ornamental ones are probably more interesting than the former ones. But we do not know yet where it is leading, and the worst thing to do is to be trapped into believing that this is the truth. The good thing is that it has a kind of experimental dimension. It is a good time to be a young architect. For example, take sustainability: nobody knows what sustainability really means in architecture, and that is why giving a meaning to sustainability will be a major challenge. And probably the computer will be one of the dimensions involved in sustainability today. To go back to architecture and science, one of the things that both architecture and science share today is this experimental dimension. Architecture is more than ever an experimental practice.

MT: So, what is the relation of architecture to sustainability?

AP: One of my strange obsessions these days is that the new limit of our world is no longer the digital. In the 1950s the digital was the frontier. The new frontier of the world has more to do with the levees of New Orleans, global warming, a lot of very concrete, old-fashioned stuff that has to do with mechanics, hydraulics, that kind of thing. And probably one way we have to cope with them is through an extreme level of calculation, and this is where the computer comes back into the picture. It is not the computer as a machine only. In order to find a solution, we had better figure out a way to be extremely smart. But if you are to think like that, even a can of Coke: remember there is this classical study on the can of Coke; the metal comes from Australia, then it is treated in Sweden, the soda comes from another place… it is totally absurd. Even the can of Coke. Today, if the entire planet were to consume batteries at the rate it is done in the Western world, there would not be enough metal, cadmium. To figure out a solution will take a lot of intelligence.

DG: What would be the role of a historian in these new conditions in architecture?

AP: To be honest, I do not think that the current period in architecture is especially prone to historical thinking. It is pretty clear that there is a steady decline in the presence of history in the schools of architecture; I am conscious of that. Contrary to some of my colleagues, I am not going to cry about it; first of all you have to be realistic and ask yourself why this is so, and it is not because students are more stupid than they used to be. I would say that the stupidity of students and professors is pretty constant from one generation to another. I think it has to do precisely with this period being so experimental. Contrary to the post-modern period, where the problems were essentially seen as linguistic, you do not seem to need as much history today. That said (I might be wrong), I do not think that history will ever disappear from architecture, for various reasons. One is, I believe, that architecture is as much a tradition as a discipline, and it has to constantly rethink critically what it has achieved in the past. Its definition of the past can vary; Babylonian architecture is of little interest to a school of architecture, let’s be clear, and even Gothic is at the limit. But rethinking critically about itself is an important dimension of architecture. This is the first reason to be optimistic. The second reason is that I believe some critical thinking is quite necessary, especially in a profession that is going to change a lot and which does not have the easiest economic and professional condition on earth. I think history can help a student to be a little more aware of himself and aware of the difficulty of making choices. I do not believe that history provides you with ready-made solutions, but history is, strangely, the study of the indetermination of the present. That is also why I am interested in the virtual: the fact that each present is full of potential. So, strangely, I believe history should be a lesson in freedom in the schools of architecture. You wouldn’t be a designer if you thought that everything you designed was totally determined, and over-determined. History is really an exploration of this freedom of the designer. After all, it is not a drama that it doesn’t attract crowds; there is still a need for its presence. To give you a very personal example: after the Croatian war, you know, all the things that happened in the former Yugoslavia, I felt a little despaired… You would have thought that people who had experienced the horror of the concentration camps and that kind of thing during the Second World War would not do ethnic cleansing. This means one thing: that people have not necessarily learned. It is not because you show two slides of concentration camps to an entire generation of kids that these same kids, when adults, will not do horrible things. So do not expect history to be efficient in that respect. That said, if you stop producing history, and by history I do not just mean books for the public at large but research in history, then gradually the exigency of truth disappears. Then, gradually, you will have people who will say, with total impunity, that the camps never existed, and so forth.

MT: Well, we have those people today.

AP: Yes, but fortunately they are checked by archives and historical research. Even in architecture, to be reminded that the latest fashion is not the ultimate truth, is also a good thing.

MT: If we can find an ultimate truth…

AP: Precisely, I do not think you can ever find an ultimate truth. But at least it becomes difficult to sell it to others as the ultimate truth. I think it plays a role of sanitation of the debate.

DG: It might also be an opportunity for history to be re-approached; since it is becoming less important, there is the chance to see architecture again and approach it in a different way, or teach it in a different way.

AP: Yes, for historians also it creates interesting questions, because you have to ask yourself: if the discipline’s position is weaker than it used to be, then it probably means that it has got to be practiced in a slightly different way. It’s not just a question for architects; it is also a question for historians.

MT: It is interesting to see the impact of speed on history. People have talked about the post-modern era, or whatever era we are in, in terms of an increasing speed of exchange. Thinking of what is important to an architect practicing now, and taking the argument for instance, which is not my stance, that Gothic or Greek architecture is no longer important, even modern architecture, since we are doing computer renderings: is history then still equally important, with only its time frame approaching something like news?

AP: Let’s be clear. Let’s go back to the Greeks, not only because we have a Greek person here. Not that I would argue that everything is [necessary]; you could dispense with the Greeks if you want, and with the Gothic. But that said, if you want to understand Mies truly, then Mies is totally indebted to Neo-Classicism and, guess what, Neo-Classicism is indebted to a certain reading of Greece. If you want to understand Le Corbusier, the Acropolis experience, just like for many people of his generation, was totally essential. I would say you cannot know everything in the world, so there is not a single piece of knowledge that is absolutely irreplaceable. But I would say, once you begin to be interested in the complexities of understanding what you are, then a lot of things reappear, because you are trying to understand where you come from. By the way, the reading we have of Greece is very far from what the Greeks had in mind at their time. So in a way, it is more about rethinking what tradition we come from. And do not forget that these questions of lineage, paternity and so forth are still so important in the psychology of people. I think for an architect to know where he comes from (and it is not necessarily from the Greeks, you can have other lineages), I think an architect has to reflect on where he sees himself coming from. But again, today, very few things are utterly irreplaceable. One of the merits of history is probably that it is the least systematic of the humanities and social sciences, which is also why it goes pretty well with architecture. Sociology was always more problematic, because sociology is much more dogmatic, so it goes less well with architecture.

MT: I would like to ask a last question. What do you think is important in the education of an architect, what is your advice for an architecture student?

AP: Well, do not forget that I was trained first in science and engineering, so I am a hybrid… I do believe the core is the design practice; the studio is to me the core of architectural education, and it might be strange for this to come from a historian, but I strongly believe it. The core is not necessarily something that should contaminate everything else. As a historian in a school of architecture, my firm belief is that, in perspective, history must be “useful”, in quotation marks, for students, but “usefulness” can be pretty distinct from direct application. I do not believe that a history class is directly applicable. I think it has a meaning to aim at.

I do not know whether I really have advice for an architect. I will make the following remark: the most difficult thing in architecture is to find the right proportion between being critical and a-critical. If you are too critical, you do not do anything; if you are too a-critical, you do stupid things. No, I am not joking. If you take design practice, design practice is not very critical, contrary to what architects very often say; architects tend to resent… There is still the temptation to make design research strictly equivalent to academic research. I think that is not true, because design research has its own a-critical stance, because it tries to produce something. Which is why you must have disciplines like history and others that give you this critical dimension. And you must find your own way to manage these two very contradictory impulses. Just like the good architect is a bizarre blend of a man of action and an intellectual: architecture is a form of action, but at the same time you must be an intellectual, which is not very simple. To me this is why architecture is ultimately political. A great politician is a man of action who gets the job done, but who also has a very contemplative stance that enables him to set visions, goals and so forth. Architecture is a little bit like that, and this is very hard to teach. This is why self-teaching is so important in architecture, because you must find the right equilibrium for yourself, and that is not very simple. Especially these days: when I mention the rise of the strategic, I think there is a very strong urge to find new ways to be both a-critical and critical. I do not totally buy Sarah Whiting’s position on the post-critical; I think we still need to be critical, but there is also the need to do things without criticizing every step. It created a small debate on the east coast [the post-critical], not very fundamental. I think it is not so simple to be an architect, because you are neither entirely in the realm of knowledge nor totally in the realm of pure action, and you are always proposing something that is in between the two.

DG & MT: Thank you very much for taking the time to talk to us.

AP: Well, thank you.
