
prss release #26, July 10 2009
the independent paper blog aggregator



branded pigs - http://www.nextnature.net/?p=3660

prss, by Edwin Gardner + Marten Dashorst + Lukas Pauer. Images para legrandcrew

.1 — David Gissen (Postopolis! LA) / David Gissen is a historian in the most untraditional sense: he looks at preserving traces of the past in the present, whether it’s air conditioning in New York, or smoke in Pittsburgh, and proposes to design them as future heritage.

.2 — Should We Abandon the “Uncreative Class”? / Richard Florida’s Creative Class has become a doctrine for those who want to make successful cities, but how inclusive is this success? Because we can’t possibly all become members of the jet-setting and job-hopping crowd, can we?

.3 — Seeing Is Believing / Simple renders of how a house can be lived in are no longer enough: if we buy a house nowadays we want to see and believe how we can live in it. Enter the ‘resident manager’.

.4 — 60. Play Peter, the Pritzker Peddling Hermit Genius / Why being a famous and well-respected architect is as much about making good architecture as it is about doing good self-promotion. “Good work + Good Promotion = Fame & Recognition.”


.5 — Measuring the Immeasurable / “Why do we measure the things we measure, and why do some things get to be measured while other things do not get measured, or are seen as immeasurable?”

.6 — Fix the Pritzker / Are the Pritzker Prize's original intentions - stimulating awareness of buildings and inspiring creativity within architecture - still valid now that architecture has become so much more than simply the genius vision of one man or one woman?

.7 — What’s A ‘Spooey’? A Field Guide To Freeway Interchanges, Part 1 / Know your cuts of highway interchanges. Cloverleaf? Easy. But what in the name is a 'Spooey'?

.8 — Incremental Housing Strategy in India / Filipe Balestra & Sara Göransson / How to design sustainable and affordable housing without architects. Community-based architecture in India.

.9 — What comes first: the chicken or the egg? Pattern Formation Models in Biology, Music and Design. / A move away from the known preconceptions in architecture can give us new insights, not only in the product of architecture, but in the process of architecture as well.

.10 — Organization and intelligence / Similar to the German army's blitzkrieg tactics in WW2, what would happen if cities let go of their strict organizational models for urban growth, and replaced them with networked, non-hierarchical emergent systems?


1 David Gissen (Postopolis! LA)

City of Sound / http://americancity.org/daily/entry/1601/ / by Dan Hill on May 26, 2009

David Gissen delivered one of my favourite talks at Postopolis! LA, for sure. Gissen is a historian - yet lest that conjure up a certain image - an AJP Taylor, EP Thompson, or Eric Hobsbawm, bless ‘em - he actually cuts a very different kind of figure: exploratory, intrinsically multidisciplinary, and given to speculative imagination. Gissen delivered a fascinating, illuminating and often funny presentation which utterly reconfigured ideas of preservation and historical research.

He started, as few others did funnily enough, by quoting Nietzsche. In stating “scholars are not human beings”, Gissen was evoking the peculiarly tortured mode that academics, researchers, scholars can end up in. He suggested that “the more you go into the archive, the less you have to say about how people should live” and wonders whether perhaps the “philosopher and architect has more to offer”. He ponders whether his kind of research can indeed lead to “becoming something less than human being,” after Nietzsche. (He’s playing this up a little much, but anyone who’s done a PhD, or witnessed a friend go through one, will know the truth of this.)

By way of contrast, he brings up Reyner Banham (which others also did funnily enough), noting his “name gets tossed around a lot”, and how different his image was. “(Banham) constantly photographs himself in a car, on a bike, in the desert - never in an archive. He’s trying to show you that one can live as a scholar that’s very different from what we expect.”

Moving on to projects, he starts with that of his colleague Jorge Otero-Pailos (corrected, thanks Javier), who conducts studies of dirt and pollution, and here we start hearing about attempts to reinvent the idea of preservation. Otero-Pailos proposed making preservations of pollution in a factory in Bolzano, Italy. He casts the dirt on walls as a way of understanding - through preservation - the lived experience of the factory, the behaviour of the factory.

Gissen then mentions Michael Caratzas, and his project for the Cross-Bronx Expressway (an infamous Robert Moses project). He proposes preserving the entire Cross-Bronx Expressway, rather than demolishing it. “What if you did preserve it?”, Gissen asks. Preserving means “to keep it in good shape and also an aspect of history …” It means both maintenance and understanding.

(It’s interesting to extend the idea of preservation in this direction. I’m personally opposed to preservation in the form of unnecessary heritage protection, taking the Cedric Price view of heritage (“York Minster? Flatten it.”) such that our cities continue to progress. But the preservation at work here is quite different to heritage.)

Gissen’s HTC Experiments (a “bloggish thing I started about 6 months ago”), is a great ongoing exploration of these issues - what he describes as “history as a form of experimentation”. One such project is located around a building for the California College of the Arts in San Francisco, which is an adaptive re-use of a former bus shed. The building would have been full of fumes in its former life. Yet now that’s all gone. It’s one of two remaining of its type. In a sense, he says, the gentrification of such spaces means literally “less exhaust”. The buses and smokestacks disappear, taking the smoke with them. Gentrification “changes the actual atmosphere of a portion of the city.”

His proposal is to “reconstruct an exhaust plume as a sculptural element”. In effect, to denote this shift by illustrating what was once there, at least symbolically. (This is an idea I enjoy, and have used in my own work to some degree, exploring how the air quality in the Green Square Town Centre urban renewal project in Sydney will have radically changed since its days as the site of the city’s primary incinerator, a space subject to decades of community concerns over air quality. I proposed real-time air-quality monitoring, projecting data visualisations onto the defunct incinerator chimneys. Leaving aside issues of representation and the form of ‘progress’ it purports to entail, it’s interesting to note that gentrification does at least mean cleaner air.)


Similarly, Gissen describes a project for Pittsburgh, to “make Pittsburghians think about how the space has changed”. To understand the history of the city through the history of the air of the city. Particularly where “the air that’s been in the city is really quite foul.” Gissen asks “What if you just projected the atmosphere that used to be in the city?” He shows an illustration, looking akin to a Superstudio project, but made out of smoke. He then decided to make “a less utopian image” (less heroically Superstudio, in that sense) and showed a progression that’s “like a balloon that’s made of smoke (to indicate that it’s) actually a really positive sign. It makes you realise how the city has transformed, (and how) the atmosphere is designed in a way that’s different from the one that was there before.”

Gissen then shows another project .. “to reconstruct floating bathhouses in the Hudson” (and thus addressing water quality), and another, centred around a chance to explore the history of the park. His proposal here was to create “a history of the park in which the park doesn’t exist.” So it’s “a proposal to do a counter-history of New York in which the parks in the park system never happened. It makes people understand the history of the city differently …” (These are all interesting variations on this theme of highlighting what is not there - what in other areas we might say are ‘making the invisible visible’ strategies - but here dealing with the creation of alternate pasts or alternate presents, in order to highlight the difference something has made. Either the difference from removal (pollution) or addition (parks).)

Gissen shows a map of Midtown Manhattan from the air, noting that many innovations in cartography have happened around Manhattan (my friend Jack Schulze has just delivered one of the latest). Yet Gissen says that “no maps tell you what it feels like”. He means ‘feel’ as in environmental quality (rather than, say, accretion and dynamics of cultural capital, which is just as difficult to map in NYC, despite what some might claim). In terms of the environmental quality of Manhattan, Gissen reckons that due to air conditioning “more indoor air was produced in this space than anywhere on Earth (until recently, possibly).”

Hence, an idea for an air conditioning map of Manhattan. Gissen’s interested in the work of Philippe Rahm or François Roche - “or those that work with atmosphere”. “How does their work get preserved? You can’t photograph it. Why not have an archive in which the actual chemical content is preserved?”

Having started with Nietzsche, Gissen closes with an anecdote about his next door neighbour’s parrot. After many days of suffering the parrot’s incessant screeching, Gissen eventually realised that the parrot was imitating the sounds around their neighbourhood. It would squawk the buses’ brakes, or imitate the sound of washing up, and so on. Gissen sees the parrot as “a type of architectural and living archive” of his neighbourhood. He then speculates further, noting that some parrots might travel through war-zones, picking up the sounds they hear there. “So their song is of urban and social destruction,” he says. They end up in zoos, singing a song of distant conflict. He says when we look at “the way that non-human life can be used as an archive, we consider the way that the social, natural and historical can not easily be divided …”

In the Q&A, Geoff recalls the Duchamp piece 50cc of Paris Air, saying it reminds him of Gissen’s approach, and wonders aloud about what the limits of preservation might be. Can we ‘preserve’ the Iraq war, or a traffic jam on the ‘10’ here in LA? Gissen says you can reconstruct anything - but preservation has its limits.


With an eye to my day job, I ask Gissen about the overlap between ‘his kind of preservation’ and planning and design in urban development. Both are involved in projections of alternative cities. Gissen projects alternate cities in order to explore history, to assess the results of interventions; in urban design, we project alternate cities in order to explore the future, constructing complex models attempting to predict the city’s behaviour as a result of a potential intervention. It’s very similar, technically.

Gissen appears to like this notion, and draws out its multidisciplinary implications - that historians might be involved with designers more directly, working on shared models, projections, analysis. He thinks that “a return to the confusion of the 18th century would be interesting, where history was not so far removed and historians would engage in (this kind of) projection too”. (I wholeheartedly concur. We should draw ever more disciplines into design practice, particularly around modelling which can otherwise be extremely reductionistic.)

As Greg Smith points out to me afterwards, it’s great to see a historian draw. I suspect they might’ve done so more frequently once, but he’s right - it’s great to see this kind of creative historical provocation rendered in such imaginative and communicable ways. Gissen’s talk is full of possibilities, looking backwards and forwards.

2 Should We Abandon the “Uncreative Class”?

Next American City / http://americancity.org/daily/entry/1601/ / Josh Leon | Tue, May 19th, 2009

What would you do if you were the mayor of Detroit? Right now entrepreneurial urbanists in Detroit and other rust belt cities are by necessity re-envisioning their urban milieus, trying to make them greener, more creative, more prosperous places. There are pockets of success here and there, but the scary part is that all of this re-imagining might not matter. Given the rate of industrial decline, it would seem that distant global forces are shaping urban landscapes in ways that are as stoppable for urban planners as the weather.

The late management expert Peter F. Drucker wrote a piece in Foreign Affairs called “The Changed World Economy” in 1986 that accurately foresaw the problems that cities are dealing with today. The manufacturing sector, he said, was becoming decoupled from blue collar employment. America may produce just as much, but it’s doing so with fewer workers. At the same time, he saw flows of global capital—investment money moving from one place to another—as replacing in importance the actual physical trade in goods. Put together, this means that job opportunities come and go, moving around the country (and globe) at unprecedented speed. I’m worried that this accelerated mobility of opportunity might be too much for most people to keep up with.

The real problem in the ever-mercurial global economy is how to manage cities whose roles in it could become unmarketable by next week. The urban theorist Richard Florida argued in the March Atlantic that the answer was to redesign our urban geography for increased mobility. That way people can keep up with changing job markets and members of Florida’s lionized “creative class” of white-collar professionals can find one another. The old system that encouraged home ownership should be jettisoned in favor of renting, which makes it easier for people to pick up and move without the time consuming agony of home selling. Cities should be more concentrated, less suburban, and more connected by public transit. I’m generally fine with those propositions, as are most urban planners. However, there are bigger issues at hand when we talk about enhancing mobility to accommodate the volatility of unleashed markets.

The inefficient suburban-centric development model of the past few decades isn’t the only reason why migration is so hard. When people migrate nationally or internationally for the (potentially false) promise of a better life, they leave behind important familial and communitarian networks of social support. Permanent communities become temporary residences of job seekers en route from one place to the next, and any sense of connection to place is lost. Relationships fall under strain as families separate out of necessity. So the system of creative capitalism can be painfully atomizing.

There are also brutal legal barriers that prevent mobility. While goods and capital can move between cities freely (and job opportunities with them), millions of people around the world have to keep their very presence a secret in order to avoid deportation or Byzantine detention. It’s ironic that, in the age of NAFTA, the governments of the U.S., Mexico and Canada are spending billions on high-tech surveillance to regulate who travels among them. In China’s coastal cities—the geographic centers of the country’s economic activity—rural migrants are a legally mandated underclass.

Finally, not everyone can afford to move, and the poorest are left behind amidst urban blight and neglect. What do we do about the immobile? What do we do with cities that are net losers of the “creative class”? For this so-called creative brand of capitalism, the uncreative are someone else’s problem. As Florida says, “We need to be clear that ultimately, we can’t stop the decline of some places, and that we would be foolish to try.” I would say that this is not at all clear. There is an inherent inhumanity in leaving people and their cities in the dust. Besides, the cost of finding ways to get so-called obsolete classes of workers gainfully employed where they live is looking preferable to the social costs of managing huge ghost cities and permanent spatial inequality.

Zsuzsanna Ilijin

3 Seeing Is Believing

Brand Avenue / http://brandavenue.typepad.com/brand_avenue/2009/05/seeing-is-believing.html / by XXX, 05/13/2009

Trying to sell your model home? Forget aspiring actors. Throw away the plastic food. Why even bother when you can have the real thing?

The fragrance of sage-scented candles and sounds of jazz fill the air of a 2,600-square-foot house a block from the beach. Tiger-striped chairs flank tables crafted from exotic woods. Photos of a chubby baby hang on the walls. Whoever occupies 211 Windward Way, they seem to live the good life.

Too good to be true, in fact. The house is owned by a builder, who hasn’t been able to sell it for more than a year. And while someone really does live here, it’s as part of an elaborate bit of stagecraft aimed at moving Southern California’s echoing inventory of luxury vacant homes. This $1.2 million seaside pied-a-terre is occupied by Johnna Clavin, a 45-year-old Los Angeles event planner and decorator who has seen business slow. In exchange for giving the townhouse a stylishly lived-in look, she gets to stay there at a steep discount and stands to earn a bonus if the house sells fast. “This is the perfect scenario for the times that we’re in,” she says.

It’s the logical evolution of home staging: living props that actually live there. Life becomes theatre, which becomes sales technique. Amazing:

Ms. Clavin, and her furniture, beat out 46 applicants who auditioned for the homeowner role, says Quality First’s owner, Mary Heineke. “I already know they can’t afford the house,” Ms. Heineke says. “I want to know if they can replicate a person who can afford that house.”

Showhomes Management LLC, a franchise operation based in Nashville, has 350 “resident managers” living in homes for sale in 46 high-end markets, including in Florida, Arizona and Illinois. The company has seen revenues increase 88% since last year, says vice president Thomas Scott. Unoccupied staged houses aren’t selling as well as those with people in them, he says, “because people can still tell they’re vacant.”

(Architects spend day upon day imagining how people will inhabit the spaces they design, and doesn’t that make them especially perceptive potential “resident managers”? Just a thought.)

I believe this could be taken further still. I’d like to see a reality show with a battle between these resident managers as the premise; hire some folks and see whose sense of self and staging acumen closes the deal the fastest. In the meantime, we can follow them on Twitter (borrowing and twisting this premise), and ask ourselves whether we think their tweets--not just their thoughts about the home, but also their thoughts about the rhythms of their day-to-day, their hopes and fears, the ineffable stuff of (their) lives--resonate with us personally. I mean, when we bring real people and their stuff into the equation of selling a home, isn’t the resulting set of impressions inevitably more complex and therefore more compelling?

John Humphrey, 63, of Carlsbad, Calif., toured the property this month. He was taken in, imagining the owner as a wealthy “world traveler,” using it as a second home. He thought the owner was “maybe a Fortune 1000 vice president...45 to early 50s.” Told that the house was occupied by a woman who’d lived there less than a week, he was briefly flummoxed. “It reminds me of a movie,” he said. But he didn’t feel hoodwinked. “I’m impressed with somebody who can create that atmosphere,” he said. “No question I’d live there if I can get something else unloaded here in a hurry.”

More than homeownership, architecture, furniture, or even marketing, it’s our desires to step into the shoes of others that rise to the surface. Or if not to become them, it’s a desire to at least be close to them, to know them personally, to join them in their enviable lifestyle. Heady stuff.

4 60. Play Peter, the Pritzker Peddling Hermit Genius

NOTES ON BECOMING A FAMOUS ARCHITECT / http://famousarchitect.blogspot.com/2009/05/60-play-peter-pritzker-peddling-hermit.html / by Conrad Newel

Some years ago, I used to go around thinking of Frank Gehry as a total schmuck. He went around making these wonky absurd things in god-awful materials and calling it architecture. His mantra seemed to be: let's do something totally stupid looking and ask why not? "I am so playful and whimsical" seemed to be the chant behind every spell he cast. The impression I had of him was that he made architecture look way too easy: you just put a newspaper or binoculars down on a model and voila! instant interesting architecture. And if anyone dared to say that they were annoyed by all this, they were labeled as narrow-minded or too "hermetic" in their thinking. All of his detractors were put in this same box. His work, especially his early work, was meant to be visually disturbing, like that of his contemporary artist friends.

Then some years later, I reluctantly went into the Guggenheim in New York to see a retrospective of his work there. I came away, surprisingly, with a very different opinion of him and his work. Although I still did not care much for it, I came away with a lot of respect for him and what he was doing. For the first time I saw all the work that went into each project (or at least the story behind them). There were several different studies and experiments leading up to the final built thing. It was very different from the outward branding campaign that was publicly seen in all the media snippets of him telling an assistant to fold a piece of paper and then: yeah! that looks so stupid, I love it! The exhibition showed a different attitude: it showed an architect testing and trying out different things, looking at how to solve real problems, attentiveness to solving space, light, materials as any and every architect does.

What I realized was that I was so hypnotized by his branding machine that I was not able to tell the difference between the architect and the branding. The exhibition snapped me out of the spell and allowed me to see the difference.

It was then to my surprise, some weeks ago when I learned that Peter Zumthor had won the Pritzker prize, that I thought to myself: "Oh that's so great, Zumthor is one of my favorite architects. I really like his work and I like his demeanor. He doesn't want publicity, he doesn't make an effort to publish his works in the glossy magazines, and finally the Pritzker prize committee selects such a person. It just goes to show, if you just work hard at what you do and don't worry about publicity or being famous you will be recognized, yada, yada, yada."

I heard myself and suddenly snapped out of it. "Hey, wait a minute!" I thought. "Just hold on one second. Stop the music! That's not true! THAT'S NOT TRUE!!"

Peter is okay and everything but don't get hypnotized by his branding machine.

Oh...and the Pritzker committee selecting an obscure nobody?...hogwash!

Time for a reality check folks.

Every famous architect whether it is Frank Gehry playing Whimsical Wizard, Frank Lloyd Wright playing Egotistical Master, or Rem Koolhaas playing Intellectual Sheep, all have a branding game-plan that is strategically aimed at getting fame and recognition. Peter Zumthor playing Hermit Genius is no different.

Fame and recognition does not just happen! You have to work at it! A million dollars will not just fall into your lap if you go to work every day and do a good job without a plan of how to make it happen, and neither will a Pritzker prize fall into your lap if you just go off deep into the mountains and make good architecture unbeknownst to anyone. Fame and recognition, like any other career path, must be carefully cultivated. It's like the old computer adage "garbage in, garbage out": the results that you get are based on what you put into it.

* If you put your energy into making good buildings, sooner or later you will realize a good building.
* If you put your energy into getting famous, sooner or later you will be famous.
* If you put some energy into making good buildings and some energy into making yourself famous, sooner or later you will be a famous architect. It requires a dual effort.
* If you find a famous architect who put most of his energy into becoming famous, it will be plain to see that he is famous for being famous. I am sure this is not what most people want.
* If you make a balanced effort in both areas, as Zumthor has done, you will see that too.


Besides doing good work, making interesting and/or quality architecture, putting an effort into being famous and getting recognition means publishing, writing, branding, going on the lecture circuit, building symbolic capital, schmoozing, and basically doing whatever you can to be visible in a positive way. The last time I checked, Zumthor had close to a dozen books published in several languages; here is a list, just to name a few: Thinking Architecture; Peter Zumthor Therme Vals; Peter Zumthor Works: Buildings Projects; Atmospheres: Architectural Environments - Surrounding Objects; Corps Sonore Suisse (Swiss Sound Box); Architecture in Vorarlberg; Three Concepts: Thermal Bath Vals; A+U Extra Edition: Peter Zumthor; and this one titled just plain old "Peter Zumthor". This does not include the countless articles and magazine publications (glossy & non-glossy) that he has personally written or consented to by providing materials (images of the works, press releases, interviews etc).

You will notice that this is not consistent with that part of his ingenious branding strategy/philosophy which says "I don't believe in publishing images of the work because architecture must be experienced first hand". He publishes anyway, because he has to. This elaborate brand is shrouded in a hermit-esque philosopher-monk mystique: his official press release photos show him clad in what looks like a priest's shirt minus the neck collar piece, unpretentious, his arms folded, a stoic glare behind a meticulously trimmed white beard and a short, militarily disciplined haircut. His writing style is largely phenomenological and reads like a mythology storybook. He touts beliefs like "I am not a networker, I’m not a difficult star. I’m simply someone who wants to do good work". He is eager to talk about how small his firm is and how selective he is with accepting commissions: "I can’t be bought with money" is his attitude towards clients.

If everyone knew how calculated all of this is, they would be astounded. Not just astounded--it would unveil his mystique and wreck his brand.

At this point, I should wave a flaming disclaimer that I don't believe he is insincere for one moment. You have to believe in your mystique wholeheartedly before anyone else can believe in you. It has to come from you and resonate with your core beliefs and who you are. That's the first rule in building a brand or mystique. It would appear that the humble-one has deluded himself into believing that all this does not equal promoting himself and his work. Your job, as an aspiring great-one, is not to buy into this delusion. Be aware that it is a brand, and that it is part of an elaborate, premeditated, well-managed promotional strategy.

Besides the publications and the mystique-branding, he has certainly not shied away from the lecture circuit: just google "peter+zumthor+lectures" and you will see a "zumthor wuz here" list of places far and wide where this globe-trotting mountain hermit has been speaking (i.e. self-promoting). If you are not an incredible networker, you cannot pull this off. But let's say you are not an incredible networker and you somehow managed to pull this off anyway: you would have to be either brain-dead or extremely socially repulsive not to come away with a network of friends and contacts that reads like a who's-who list in the world of architecture and beyond.

If you can ever find a copy of his resume, you will also see that he has built a treasure chest of symbolic capital: connections, awards, prestigious teaching positions, etc. He has taught at renowned schools from SCI-Arc in Los Angeles to the Graduate School of Design at Harvard University (far away from the obscure mountains of Switzerland), where he has no doubt rubbed shoulders and schmoozed with the famous and well connected (even some of the jurors of the Pritzker prize committee).

I am sorry, I just don't believe in random luck. There is a saying "the harder you work the luckier you get". Zumthor certainly worked hard at his luck and now it has paid off. He is a brilliant networker, brander, and self-promoter whether he sees himself that way or not! Three cheers to Mr. Zumthor for a Pritzker well earned. Don't envy him, emulate him!

As for the Pritzker prize committee, they have historically given the prize to well established starchitects: heavyweights in the field who have largely branded themselves as stars. As we have seen in the previous post, the era of the starchitect is over. The committee finds itself in an awkward position. On the one hand they are way too embarrassed to bestow the award on someone who overtly brands himself as a starchitect, and on the other hand they are way too parochial to pick someone outside the establishment. So why not choose a starchitect who brands himself as "I am totally not interested in being a starchitect"? What other choice did they have? I am really curious to see who the other candidates were.

In the coming days you will see many articles published about the Pritzker prize winner, where he will be lauded for operating outside the establishment in a tiny remote village in the Swiss mountains, far removed from the international architecture scene. They will write about how he eschews the publicity and the promotions. They will describe him as the son of a lowly carpenter. Upon being told that he would receive the prize, he made this statement:

That a body of work as small as ours is recognized in the professional world makes us feel proud and should give much hope to young professionals that if they strive for quality in their work it might become visible without any special promotion.

Without any special promotion? Huh? When you look at the avalanche of these disingenuous statements, I implore you to resist gazing at the swinging pendulum of this hypnotic branding machine. I offer this statement to young professionals instead:

You won't get wise with the sleep still in your eyes, no matter what your dreams might be.

I am happy for Peter Zumthor, I wish him well, he is a good architect, and as I said before, I like his work very much, but please don't insult me with the "I just make good work in the tiny Swiss mountains" storyline. Success leaves clues, and they are there to see, right before our eyes, if we only open them and look.

Good work + Good Promotion = Fame & Recognition.

I AM TOTALLY NOT INTERESTED IN BEING A STARCHITECT

Annemarie van den Berg


5 Measuring the Immeasurable

Near Future Laboratory / http://www.nearfuturelaboratory.com/2009/05/18/measuring-the-immeasurable/

Good, Fast & Cheap, a measure of things. Designed by Rhys Newman, modeled & machined by Simon James. A useful epistemological wrench — a conversation piece to discuss the measures of things and the things lost and gained when some things gain priority over others. It’s a wonderful, crucial instrument that shifts perspectives hopefully towards more habitable creations.

Everybody needs Money. That’s why they call it Money. (From “Heist” by David Mamet. Danny DeVito playing Mickey Bergman.)

In the Laboratory’s Bureau of Instrumentation, Weights-and-Measures and Ways-and-Means, we’ve been curious for a time, and more so recently, about the history of quantification and, as well, why numbers as such have a kind of primacy over other things that are more qualitative. Most specifically, why do we measure the things we measure, and why do some things get to be measured while other things do not get measured, or are seen as immeasurable?

This question is a thorough-going one in the effort to find other measures that can be prioritized, perhaps even more so than the things we consider without even thinking about where these “natural” (they never are..) measures come from. For example, we measure things designed based on such things as their monetary cost, and how much profit can be obtained. With this measure, to simplify things, many principles that would be invested in a design get tossed out. The accountant or the engineer would sooner shrug in such a circumstance — this is the way it should be. I want to consider the “natural” way of such things, and consider how other sorts of measures can be prioritized that are not necessarily about money first, but always first about creating more habitable future worlds. What are the other measures of things that maybe previously have been thought of as “immeasurable” or incapable of being quantified? Thus, this interest in how things got to be the way they are. What are the measures of quantities and where did they come from? How could they be different? What things can be designed/made/prototyped to experiment with other measures?

There must be a variety of histories here, skirting up against the science of calculation and computation and close to the Laboratory’s interests in the history of things, such as sciences, design and technology, and to that which is always deeply imbricated and layered and inextricably tied to all of these things — the histories of cultures.

Where to start?

Our Studio Library Day reading book last month was Alfred W. Crosby’s intriguing “The Measure of Reality: Quantification in Western Europe, 1250-1600”, which I’m happy Manuel Lima mentioned at his talk during SHiFT 2008 in Lisbon. It took a holiday to actually finish the book, which led me in a zillion other vectors and converged in a recent interest in the meaning and technology of money. It now takes me early mornings, when fresh and not muddle-headed, to re-read it for the over-arching traces of Crosby’s perspective.

I’ll have my overdue book report on “The Measure of Reality”, but first a short trek down the footnote rabbit hole to Joel Kaye’s essay “The impact of money on the development of fourteenth-century scientific thought” found in the Journal of Medieval History 14(1988), p. 251-270. (Wonderful these academic essays tucked away in journals nearly impossible for anyone except academics to get a hold of, truly. Sadly, the availability of this essay is quite limited unless you have an “in” at a university or such, or a particularly flush public library system somewhere. So much for the academician’s edict to create and circulate knowledge.)

*Sigh*

No matter. Onward…

Good, Fast & Cheap, a measure of things. Designed by Rhys Newman, modeled & machined by Simon James.

I came across the Kaye essay during a passage in “The Measure of Reality” where a point is made that the medieval brainiac Nicole Oresme (1323-1382) wished to make the case, as he was advisor to several Kings of France, that the debasement of coin did not serve the public good. In other words, the artificial creation of inflation was not a good thing, according to Oresme. He saw money as part of the public commons, and its manipulation in value was controversial. But the royals controlled the minting of coin — an early-days Federal, or rather Monarchical, Reserve Bank of a sort. A prototype central bank, controlled by the monarch. If the King needed to buy more broadswords, arrows, horses and men for the various wars run wild, stamping out more coins gave him more money. The downside is that the coins are debased in value, ultimately impoverishing the larger society.

(Parenthetically, inflation is one of those things that mystifies me. I cannot fathom how something becomes more expensive by small percentages over generations. Why was a cup of coffee once 5 cents and is now 5 dollars? Does it only cost more because prices are raised just…to raise prices? Where does it all start? If we say that the cost of a loaf of bread goes up, so the guy who grows the coffee beans is going to demand a little bit more annual income, then I have to ask — why did the price of making a loaf of bread go up? Was it because the bread maker saw his price of a cup of coffee go up and, heck…he’s not going to go without his morning coffee, so he’ll charge a little bit more to those who buy his bread so he can afford that cup of coffee. But, wait..why did the price of the coffee go up?

Baffling. We’re waiting for your explanations to clear the “recursion level exceeded-202” error on the Laboratory’s mainframe.

Is it just that someone somewhere in the hierarchy of money flows decides, like the royals of 14th century France, that they want a little bit more quantity of the stuff? Like..someone asks for a slightly larger salary, or charges a little bit more for that loaf of bread, and then the “consumer price index” goes up a bit and then things are inflated? I really want to know how this all works.)
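(A minimal note in modern textbook notation, added as a pointer rather than as an answer the original post gives: the usual bookkeeping identity behind the quantity theory of money mentioned further down is

$$ M \times V = P \times Q, $$

money supply times velocity of circulation equals the price level times real output. Read against the 14th-century case: if the King mints more coin, so $M$ rises, while the realm produces the same goods ($Q$ flat) and coins change hands at the same rate ($V$ flat), the identity leaves only $P$ to rise, which is Oresme's debasement-as-inflation complaint in one line. Whether this settles the coffee-and-bread spiral above is, of course, exactly what remains contested.)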

The reference to this point of 14th century French inflationary “policies” was found on page 69 of “The Measure of Reality”, and points to the Kaye essay. So..I read it.

It’s actually quite fascinating. In it, Kaye (who apparently has written masterfully on this sort of thing, so that’s another rabbit hole) describes his take on the history of quantification — what are the ways that 14th century men (they were..) started measuring and numbering and quantifying things?

In the 14th century, there was a frenzy of quantification. Numerical rules and mathematical methods were applied to the solution of philosophical questions of all sorts. These were, literally, “The Calculators” — those men who involved themselves in the science of calculationes. “They were those whose habit it was to inquire into the quantitative or pseudoquantitative aspects of phenomena and processes.” (Anneliese Maier 1982:149)

At this time, things of quantity and quality were quite distinct, but the frenzy takes hold in a variety of ways. Natural philosophers and mathematicians search for “ways to measure and express mathematically the successive increases or decreases in the intensities of qualities” and thereby gradually transform things of quality to things of quantity through the efforts and activities to measure. In their science of calculating whatever they could get their paws on to measure, they, in these activities, began to transform what was a quality into a thing that could be measured in ratio to some fixed quantity. The effort itself of quantifying things brought about the milieu of quantification of things.

“The new possibilities of measurement pioneered by this school in the area of velocities, forces, resistances, and other aspects of motion, proved so exciting, and fell on such fertile intellectual soil, that soon not only things that had never been measured before, but things that have never been measured since, were subjected to a kind of quantitative analysis..

“Every ‘quality’ that could be imagined in some way as an intensive magnitude (that is, capable of increase or decrease), whether physical or mental in nature, was considered just as measurable as a spatial magnitude.

“Such qualities as love, faith, race, the strength of the will, the power of the intellect and even the depth of religious vision were all seen as ‘divisible according to intensity’ and thus measurable in the same way as a physical quality such as heat.” (Kaye, 1988, p. 255)

It’s kinda crazy, but effectively the activity of measuring things leads to things that become measurable. Not everything took hold, and this is the point Kaye makes. While other historians have preferred to look to the smarty-pants natural philosophers, working toy-problems and in their solitary mode of pondering and so forth of the day — the academicians and thinkers and thought-experimentalists and such — Kaye turns his attention to larger societal activities. He looks at the monetization that is going on in the society at large, in the marketplaces, on the streets buzzing with trade.

Money is taking hold as a mechanism of exchange. It is becoming the great pivot upon which exchange can happen because it balances out inequities and is divisible into appropriate quanta suitable to the satisfactory completion of individual trade.

It’s crazy to imagine from where we are today, but before money, trade was literally the trade of things, not an exchange of a thing with something fungible, like money. If I only had, say, a plump, prolific farm animal with me to trade for something, chances were that I was not going to trade it for a mere handful of carrot seeds, or the pint of ale I thirsted for. And I couldn’t exactly say — look..I’ll give you this sow’s left hock for an ale. Sows, unlike the new money, were not divisible, at least not before the butchers, and a butchered sow can only be cooked and eaten, which means it won’t ever make little piglets, which is where its real value lies.

Good, Fast & Cheap, a measure of things. Designed by Rhys Newman, modeled & machined by Simon James.

This period of monetization allowed everything — large and valuable, small and insignificant — to have a price; everything can be sold with this new instrument of exchange. It must have been a heady, crazy time in the marketplaces with this new instrument of exchange established.

“The money price of a commodity represented the extension of a common, objective system of measurement into what had formerly been (under barter) a realm of subjective valuation. The price of an object is, simply, the expression of its value numerically, the quantification of its quality.”

Money becomes the unifying equalizer, whatever the quantity, between anything, arbitrarily. We learn more about what money was at this time through Kaye’s analysis of what Nicole Oresme is working on. Oresme is working on an extended tract on a number of topics related to money, including its origins, morality of money, religious connotations of usury, material characteristics of money — how much gold and silver it should contain, etc. Oresme sees money as an “instrument” — a mechanism that can overcome the subjectivity of value. It has been noted that he called money an instrumentum equivalens — “..a divisible scale that enabled goods of diverse value to be measured against each other and to find equality.”

What seems to happen is money becomes — Money, the thing-in-itself that can equate and quantify and balance. It is very much a 14th century technology that overcomes the inconveniences of the bartering process and the attendant problem of deciding and agreeing upon the value of goods and services to be bartered. It is an instrument of trade, an artifice invented to be a ruler that measures value. It created a kind of order upon the complex problems of valuation. It simplified, cleaned up and created a common measure.

With this monetization happening all around, and the simplifications to exchange, the cleaning up of the barter system, it is not surprising that other kinds of measures were sought after. If money could solve the problems of trade and exchange at the time, what other instruments of measurement and quantification could be found?

Therein developed a fascination with the possibilities of measurement and quantification and desires to bring a common system of measures to a whole variety of qualities — heat, whiteness, attraction, human friendship, joy and pain, intellect, religious vision, healing, hypnotism.

Lots of crazy measures of what we call qualitative things. This is what was happening in the marketplace with monetization. Very different things were “ordered” and disciplined into a new, common categorization. This is the measuring technique of money, as an instrument. From shoes to salvation (as such was to come from indulgences from the Church), from a sack of pepper to an hour of time (which introduced a whole new episteme of value) — these were all brought into common measure by money.

Oresme’s work on the “latitude of forms” is an important contribution to the thinking on the quantification of qualities. He sketched drawings to show the variation of things, which was a dramatic simplification of the philosophy of the form of things. “Latitude of forms” refers to the breadth of a quality, as in its variability. (Variability with time need not be the reference, although it appears that many examples describing what Oresme was tinkering with here refer to things that one immediately considers varying in time.)

These geometrical drawings provided a simple visualization that revealed the “extension” of a quality versus its intensity at each point. Thus, for example, the speed of a cannonball can be revealed as the height of a line over time. The process of performing these measurements effectively performs a kind of epistemological transformation, wrenching the “Form” of the thing (its qualities) into a quantifiable state, through the accumulation (adding-up) of the area under these graphs in order to produce new material for consideration.

The canonical example of a rock dropped from a suspended height comes to mind. It may once have been that it is the rock’s “Form” or quality or character to fall to the earth because it is of the earth and likely returns to its sphere of origin, which made perfectly good reasoning in the medieval. Now, the travel of the rock (and the cannon shot, and the arrow’s flight, etc.) becomes something measurable in its “breadth” or extension (to which “latitude” refers), and whose travel is measured in time and various other factors (such as angle of projection, which was significant for obtaining varying distances). The “Form” of things can become quantified. Oresme’s geometrical sketches provide a tool for thinking and transforming the way one considers what a thing is. That is, we think of quantity quickly in many matters, largely because we were transformed into beings who saw quantity as a quick way toward describing what things were.
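(To make the geometric idea concrete, a minimal sketch in modern notation, offered as an illustration of the Merton/Oresme mean speed result rather than as anything drawn from Kaye or Crosby: plot the intensity of a quality, say speed, as the height of a line over its extension in time; the quantity of the quality is then the area under that line. For an intensity that grows uniformly from $v_0$ to $v_f$ over an extension $t$,

$$ Q = \int_0^{t} v(\tau)\, d\tau = \frac{v_0 + v_f}{2}\, t, $$

which is why a uniformly accelerating body covers the same distance as one moving steadily at its mean speed: the area under the sloped line equals the area of the rectangle drawn at the mean height.)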

There are a variety of implications and conclusions proper to the Laboratory’s objective of following its curiosity.

First as pertains to money:

Money brought order into complex problems of valuation; simplifying, cleaning up, creating a common measure of value.

Fixed, absolute measures. Is it best if money remains fixed? But of course it does not. I believe this is something Oresme wanted to create — and is why he referred to money as part of the public commons. It is curious to imagine other measures that are part of the “public commons” but do not remain fixed. What would a length-measuring ruler that suffered inflation and recession look like? How would it change our perception of measures, quantities and such?

Money and the hidden mechanics, manipulation: The technological form of money – its mechanism – is hidden, unlike many other technologies of the time (wind mill, mechanical clock, etc.)

Money only works in ratio, relative to things amongst themselves. It is not a thing-in-itself; not an absolute measurement. It was a measurement that was meant to facilitate action — exchange — and make equalities.

And as pertains to the quantity measure of things:

Latitude of forms applied to creating other scales: The latitude of forms evolves to lose its sense of abstraction “from its early definition as the conceptual range of possible increase or decrease of a given quality, to its physical identification as the actual intensity of a quality at a given point.”

(* the latitude of forms transforms quality to its quanta; it's a transforming “epistemological wrench” of sorts; a measuring tool that transforms the way something appears, or offers an opportunity to measure the form of something by investing quantities of some sort into it, perhaps in the medieval these were not first-and-foremost what the thing meant, or how it was received. *)

It was believed by Oresme that any quality subject to change in intensity could be measured; the line was the best instrument for doing this, in ratio.

(* correlation with the “quant” activities in financial markets? reliance on complex, indecipherable formula? *)

Why do I blog this? There is a worthwhile historical question here that points out that the measures of things have not always been as they are. That the measurements we take for granted as essential things came from somewhere, and were not passed down as “natural”, derived from the invisible nowhere before time. There is a history to the ways that some things get to count “more” than other things; that some things are quantifiable and other things are not.

Things that are quantifiable have not always been as they are — as quantifiable things. They become so, through the intensity of efforts put to them by those who are taught to prioritize counting things through whatever mechanisms they deem useful toward this goal, which are always social mechanisms, including spreadsheets, mathematics and formula. This is done to bring order to things that look as though they are not ordered, or in a mess. This kind of ordering must necessarily factor out certain characteristics that may in fact be desirable. In the effort to quantify, what is lost as a kind of excess that cannot be factored as a quantity?

Why would one be intrigued by this? Or, more to the point — why does the Near Future Laboratory’s Bureau of Instrumentation, Weights-and-Measures and Ways-and-Means care? We care because we believe that now is the time to prioritize other measures of things, at least in many circumstances. The quantity theory of money, for example, may not best suit the creation of habitable futures, or it may even not suit the creation of material things which can obtain their value not in the amount of dollar bills they can be traded for, but in the quality of the experiences they create, whatever those experiences may be.

* Can design make decisions exclusive of the quantity of money theory?
* Can an operation create things without quantifying its future in terms of margins of profit?
* Can we look at a falling rock and marvel at its destiny, rather than the quantities measured in its velocity curve? Or fire, considering how it reaches for the sun?

6 Fix the Pritzker

Architect Online / http://www.architectmagazine.com/industry-news.asp?sectionID=1006&articleID=938850 / by Clay Risen on April 1, 2009

THIRTY YEARS AGO in May, Philip Johnson was awarded the first Pritzker Prize, an honor often called—need I say it?—the Nobel of architecture. But which Nobel? The prize for literature, a solitary pursuit? Or peace, which sometimes goes to individuals, but just as often to organizations?

It's an important question, because it gets to the heart of the problem plaguing the Pritzker. The prize, worth $100,000, is narrowly constructed to recognize the singular genius of the designing mind—so narrowly that in 1991 it went to Robert Venturi but not his partner Denise Scott Brown, with an explanation that the prize could only go to one person.

Presumably this glitch was fixed by 2001, when it went to Jacques Herzog and Pierre de Meuron. But the fact remains that by highlighting a single architect—rather than a team, or a building—the prize grossly distorts the reality of the architectural endeavor. As the country goes through all sorts of economic and social tumult, architecture should clean house, too, starting with the Pritzker.

Pritzker Prize winners Renzo Piano, Frank Gehry, Zaha Hadid, and Jean Nouvel (left to right) consider a question from host Charlie Rose during a taping of The Charlie Rose Show, held in Washington, D.C., in June 2008.

The prize's well-intentioned namesake, the late hotel mogul Jay Pritzker, believed, according to his son, “that a meaningful prize would encourage and stimulate not only a greater public awareness of buildings, but also would inspire greater creativity within the architectural profession.”

But what do those goals mean? In a way, architecture is perfectly, even banally, visible. We live in houses and work in office buildings. Granted, that's not what Pritzker meant. He meant architecture as a practice and an art—though more the latter than the former. No points for building a successful firm; what counts in the Pritzker race is aesthetic vision. Picasso was a great and famous painter, but Álvaro Siza, a great architect, is hardly known outside the profession. The Pritzker tries to rectify that (not that it always works—sorry, Sverre Fehn).

Is this the right way to look at architecture? According to the Pritzker (and its older cousins, the AIA Gold Medal and the Royal Institute of British Architects' Gold Medal), architectural excellence is one and the same as individual vision. But is it? Especially today, architecture is a collaborative undertaking; we maintain a collective illusion when we say the name on the front door is also the wellspring of a firm's ideas. Some firms recognize this, which is why we have UNStudio instead of van Berkel & Bos.

Moreover, by treating architectural excellence strictly as a question of vision, we make architecture an end in itself, like art, when it's just as much a means to an end. We don't just build structures to look at, but to live in and use, and that utilitarian function brings with it a slew of social and moral questions that a narrow focus on aesthetics so often avoids.

Yet the Pritzker—and, let's be honest, most of the profession—draws no distinction between luxury condos and homeless shelters. And because high-rent condo projects pay better, that's where the talent, and the recognition, go. Which is the real tragedy of the Pritzker. It's an award with a big name and a lot of money behind it, but instead of correcting for architecture's flaws, it reinforces them.

Compare it with three other prizes. The Aga Khan Award goes to buildings, not architects, and to qualify, a structure has to be at least three years old, so judges can evaluate how well it functions. The premium is on utility and strength as much as beauty.

Or consider the Vincent Scully Prize, which sometimes goes to architects but more often to teachers and activists and emphasizes scholarship, preservation, and advocacy. It places architecture within a social and moral context, and it recognizes that shaping the built environment is about more than crafting pretty objects. And there's the AIA Firm Award, which honors a firm's collaborative skills as much as its final products.

Unlike its 20th anniversary—accompanied by books, galas, and exhibits—the Pritzker's 30th is pretty subdued. The economy is down, architects are going jobless, and everyone is reflecting on why they got involved in the field in the first place. There's a real opportunity to reorient architecture toward more humane, socially engaged goals. Getting rid of the Pritzker—at least as we know it today—would be a good start.

Clay Risen is the managing editor of Democracy: A Journal of Ideas. He has written about architecture for Metropolis, The New Republic, and Slate. He lives in Washington, D.C.

Bouwe van der Molen


.7 — What’s A ‘Spooey’? A Field Guide To Freeway Interchanges, Part 1

Infrastructurist
http://www.infrastructurist.com/2009/05/18/dont-pluck-the-cloverleaf-a-field-guide-to-highway-interchanges-part-1/
by Jebediah Reed on May 18, 2009

Everybody knows what a cloverleaf looks like — but could you identify a volleyball, a double trumpet, or a “spooey” if you drove on one in the course of your highway travels? These are among the distinctive designs that transportation engineers have conjured up to keep traffic flowing and motorists headed in the right direction when major roads intersect.

For your driverly edification, we’ve compiled photo examples of more than 20 different kinds of strange and delightful highway interchanges found both here in the US and abroad. In fact, right now stimulus dollars are being spent to build or upgrade many interchanges into one of these forms.

The Turbine - A “free-flow” style of interchange like the cloverleaf — that is, no traffic signals or intersections. This example is in Florida, at the junction of I-75 and I-4:

The Cloverleaf - A classic, but it has fallen into some disfavor among traffic engineers because cars enter and exit in the same lane, which causes weaving. It also doesn’t handle large traffic volumes as well as some other configurations (for example, stacks).

The Stack - A vertically layered arrangement of highways and connecting elevated ramps. The number of levels varies and can go as high as six (though three and four are more common). Stacks are expensive to build but very efficient for high traffic volumes. This example is in Shanghai, but there are many stacked interchanges in the US:

Nine Fluitsma


The Lofthouse - A roundabout over two grade-separated highways. Less expensive than a stacked interchange, but it also has much lower capacity.

The ParClo - Or Partial Cloverleaf, a very popular design for places where interstates meet larger state and local roads. Depending on how the loops and ramps are configured, a parclo is classified as either an A or a B and a number 1 through 4. This is an A-4:

The Butt - A highly gluteal variety of the parclo. This example is in Germany:

The Clovermill - A partial cloverleaf with turbine-style flyover (or, elevated) ramps:

The Cloverstack - Combines elements of cloverleaf and stack designs. This rather feminine example is in Eastern Europe:

The Spaghetti Bowl - Here we enter the realm of shapes and patterns that seem describable only by chaos theory or string theory. Spaghetti is a global phenomenon these days.

Classic Diamond - A simple and venerable design. It doesn’t eat up much land, but it can easily get backed up. This one is in Kentucky, near Louisville. (There is also a “Diverging Diamond” variant that involves driving on the “wrong” side of the road — for an example see this video showing one that’s being built in Utah.)


The Spooey - The Single Point Urban Interchange (or SPUI) is very compact and one of the best choices for tight spaces in cities. Unlike the diamond, it sends all traffic through one signal. The disadvantages of this arrangement are that it can be confusing to some drivers and that it tends to be inhospitable to bikes and pedestrians. (See a visualization of a SPUI here.)

The Braid - This Maryland interchange is a stack design, but what’s unique about it is that the north- and southbound segments of I-95 and the east- and westbound segments of I-695 are briefly braided over each other in the middle of the interchange. (See a diagram here.)

.8 — Incremental Housing Strategy in India / Filipe Balestra & Sara Göransson

Archdaily
http://www.archdaily.com/21465/incremental-housing-strategy-in-india-filipe-balestra-sara-goransson/
by David Basulto on May 8, 2009

Aerial collage: the new archipelago of incremented kaccha houses rising from a context of well built permanent homes in a typical slum.

The problem with social housing has always been how to deliver the most with the least money. There are very good examples in Europe, but the constraints there are very different from those in developing countries, where almost all construction is done by anyone but architects. In these countries architects can clearly do something far more valuable than just designing or building: developing strategies together with communities to achieve housing solutions that not only address today’s necessities, but can also be extended over time as families grow, once again by the families themselves and without architects.

Pierre Derks


A good example of this is Elemental, led by Alejandro Aravena, which has been changing not only the design of social housing but also public policy. They currently have built and ongoing projects in Chile, Mexico and other countries.

There is also the work that Filipe Balestra and Sara Göransson have been doing in India, invited by Sheela Patel and Jockin Arputham of SPARC to develop an Incremental Housing Strategy that could be implemented anywhere.

Both Filipe and Sara had a very interesting background for this kind of project. Filipe had previously designed and built a school and community centre in Rocinha, Rio de Janeiro’s largest slum, in a participatory design and construction process together with the locals. The project, called Sambarchitecture, was documented in a film shown at Cinema Zita during the Brazilian Film Festival in Stockholm, and was also exhibited at the Architecture Museum of Stockholm and at Botkyrka Konsthall. Sara has been working on a strategy to connect Stockholm, framing future urban development as urban bridges between segregated suburbs.

Informal office in Koregaon Park, Pune.

Design team: Filipe Balestra, Sara Göransson, Guilherme de Bivar, Martinho Pitta, Rafael Balestra, Remy Turquin, Carolina Cantante

With SPARC and Mahila Milan

Soon after Filipe and Sara arrived in Bombay, a team of international architects, urban planners, landscape architects and graphic designers volunteered to set up the strategy, which uses existing urban formations as a starting point for development. Organic patterns that have evolved over time are preserved and existing social networks are respected. Neighbors remain neighbors, local remains local.

The following pictures show the life inside of old temporary houses (kaccha). All photos were taken in Netaji Nagar, Pune, India in 2009.

When Filipe and Sara started working they did not know that the Indian government would initiate a grant of 4,500 euro per family for the incrementation of their homes at a national scale. The grant is now active and can be given to any family who lives in a kaccha, an old temporary structure not suitable for living. It is called the City In-Situ Rehabilitation Scheme for Urban Poor Staying in Slums in City of Pune Under BSUP, JNNURM. The strategy strengthens the informal and aims to accelerate the legalization of the homes of the urban poor. Their strategy was arranged to fit the parameters of this grant.

All proposals are for one family and a 270 sq foot area (grant regulations). Each house will also have a new individual toilet and kitchen. The existing houses have neither toilets nor kitchens. The government will provide new infrastructure, which will be brought into every house.

Implementation collage: kaccha houses incremented and customized

Mixed cluster featuring houses C-A-C-B-C-A. Families will share walls, columns, beams and infrastructure.

All prototypes need the participation of the community to emerge. The rules of the grant say each family has to contribute 10% of the maximum total of 4,500 euro that the house costs. Since some families are not ready to give that amount, we are working on alternative ways to contribute, e.g. sweat contribution: after the reinforced concrete structure is up, the families can help by placing windows and doors, painting the house the color they want, and laying their own floor tiles. Thus, the families end up owning the process by customizing their homes.

Workshop in Netaji Nagar, Yerawada, Pune

Far left: Savita Sonawane from the Community Based Organization Mahila Milan, explaining the strategy to slum dwellers of Netaji Nagar. Far right: Filipe Balestra sketching possibilities.

The pilot project will be implemented in Pune, India. Filipe, Sara and SPARC are now spreading the word to implement the strategy in other countries with similar needs: Brazil, Kenya, South Africa, the Philippines - the list is long - one third of the world’s urban population is now living in slums.

Each family is free to choose one of the 3 incremental prototypes:

House A: a 2-story house structured as a 3-story house, allowing the owner to extend the house vertically in the future without structural risk.

House B: a 2-story house on pilotis, allowing the owner either to leave the ground-level space open for parking or to increment it as a shop or an extra bedroom.

House C: a 3-story house with a void in the middle. This void can be used as a veranda, living or working space, and the family can close it in order to create a new bedroom in the future.

.9 — What comes first: the chicken or the egg? Pattern Formation Models in Biology, Music and Design.

the T-Machine
http://the-t-machine.blogspot.com/2008/06/what-comes-first-chicken-or-egg-pattern.html
by Katerina Tryfonidou & Dimitris Gourdoukis on June 3, 2009

The popular question of whether the egg comes before the chicken or vice versa implies a vicious circle in which all the elements are known to us and one simply succeeds the other in a totally predictable way. In this article we will argue, drawing on fields as diverse as experimental music and molecular biology, that development in architecture, with the help of computation, can escape such a repetitive motif. By employing stochastic processes and systems of self-organization, each new step can be a step into the unknown, where predictability gives way to unpredictability and controlled randomness.

01. Music

The Greek composer and architect Iannis Xenakis, in his book Formalized Music [1], divides his works - or rather the methods employed to produce them - into two main categories: deterministic and indeterministic models. The two categories, derived from mathematics, refer to whether or not randomness is involved in the compositional process. As Xenakis himself explains, “in determinism the same cause always has the same effect. There’s no deviation, no exception. The opposite of this is that the effect is always different, the chain never repeats itself. In this case we reach absolute chance – that is, indeterminism”[2]. In other words, a deterministic model does not include randomness and will therefore always produce the same output for a given starting condition. Differential equations, for example, tend to be deterministic. Indeterministic or stochastic processes, on the other hand, involve randomness and will therefore produce a different output each time the process is repeated, even given the same starting condition. Brownian motion and Markov chains are examples of such stochastic mathematical models. Xenakis’ compositional inventory includes processes from both categories[3].
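
As a minimal illustration of the distinction (a sketch in Python, not drawn from Xenakis’ own scores), the fragment below contrasts a deterministic rule, which returns exactly the same sequence on every run, with a stochastic random walk, which produces a different trajectory each time while sharing the same general character.

import random

def deterministic_sequence(x0, steps):
    # Iterate a fixed rule: identical output for identical input, every run.
    xs = [x0]
    for _ in range(steps):
        xs.append(0.5 * xs[-1] + 1.0)  # same cause, same effect
    return xs

def stochastic_sequence(x0, steps, step_size=1.0):
    # A simple random walk: a different trajectory on every run.
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + random.uniform(-step_size, step_size))
    return xs

print(deterministic_sequence(0.0, 5))  # always [0.0, 1.0, 1.5, 1.75, 1.875, 1.9375]
print(stochastic_sequence(0.0, 5))     # changes with every execution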

As said above, the use of stochastic models in composition results in a process that produces a different outcome each time it is repeated. For example, using Brownian motion[4] (see figure 01: particles generated using Brownian motion[5]) to create the glissandi of the strings means that the glissandi are generated through a process that includes randomness; if we try to generate them again we will get a different output. At the same time, all the different results of the process will share some common characteristics. With that in mind, one would expect such a musical composition to vary, at least in some aspects, each time it is performed. However, that is not the case with Xenakis’ works. While he employed stochastic processes for the generation of several parts of his scores, he always “translated” his compositions into conventional musical notation, in such detail that he left no space at all for the performer to improvise or to approach the composition in a different way. In other words, the generation of the score involves randomness to a great extent, but the score is finalized by the composer so that each time it is performed it remains the same.

Figure 01: Brownian motion. Object-e architecture: space_sound. 2007.

What is perhaps even more interesting is that Xenakis did compose scores that are different each time they are performed. However, those scores usually employ deterministic mathematical models, that is, models that do not include randomness. In those cases the situation is inverted: the generation of the score is deterministic, but the performance may vary.

An example of the latter case is Duel, a composition based on game theory[6]. The composition is performed by two orchestras guided by two conductors, and is literally a game between the two that in the end yields a winner. Each conductor has to select, for each move, one of seven options predefined by the composer. A specific scoring system is established, and the score of each orchestra depends on the choices of the two conductors[7]. The result of this process is that each time the composition is performed, the outcome is different. Therefore, a deterministic system with seven specific and predefined elements produces a result that varies with each performance of the score. To make things even more complicated, the seven predefined musical elements were composed by Xenakis with the use of stochastic processes. To summarize the structure of Duel: Xenakis generated seven different pieces using stochastic processes, therefore seven pieces that include randomness.


However, those pieces were finalized by the composer into a specific form. They are then given to the conductors, who are free to choose one for each move of the performance. The choice of each conductor, however, is not random: “… it is [not] a case of improvised music, ‘aleatory’, to which I am absolutely opposed, for it represents among other things the total surrender of the composer. The Musical Game accords a certain liberty of choice to the two conductors but not to the instrumentalists; but this liberty is guided by the constraints of the Rules of the Game, and which must permit the music notated by the score to open out in almost unlimited multiplication.”[8] So the choices of each conductor are based upon the strategy he follows in order to win the game, and consequently upon the choices of the other conductor. Therefore the final performance of the score is different each time.
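
The mechanics of Duel can be pictured with a small sketch (a loose, hypothetical illustration, not Xenakis’ actual scoring table): two conductors repeatedly pick one of seven predefined pieces, and a fixed payoff table scores each pairing, so the rules never change and yet every performance ends differently because the players’ choices differ.

import random

# A fixed (predefined) 7x7 payoff table standing in for the scoring rules of Duel;
# the values are invented for illustration and generated once with a fixed seed.
_rules = random.Random(0)
PAYOFF = [[_rules.choice([-1, 0, 1]) for _ in range(7)] for _ in range(7)]

def play_duel(rounds=10):
    # Two conductors repeatedly pick one of seven predefined pieces; zero-sum scoring.
    score_a = 0
    for _ in range(rounds):
        a = random.randrange(7)   # conductor A's move (a stand-in for a real strategy)
        b = random.randrange(7)   # conductor B's move
        score_a += PAYOFF[a][b]   # A gains what B loses
    if score_a > 0:
        return "conductor A wins"
    if score_a < 0:
        return "conductor B wins"
    return "draw"

print(play_duel())  # fixed rules, yet the outcome differs from performance to performance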

Xenakis was quite specific in his decisions regarding the use of deterministic or indeterministic processes. In most cases he employed models from both categories in each composition. More importantly, while he called his music “stochastic music”, the stochastic part, the place where randomness occurs, is always internal to the process and totally controlled by the composer. The final result - the score that reaches the performer, or even more the listener - is always specific. Even in the cases where the outcome may vary, it still does so within a set of predefined solutions already foreseen by the composer.

02. Life Science

The study of morphogenesis, one of the most complex and fascinating research topics of modern biology, aims to provide an understanding of the processes that control the organized spatial distribution of cells during the embryonic development of an organism. In other words, how, starting from a single fertilized egg, the overall structure of the body anatomy is formed through cell division and specialization[9]. According to Deutsch and Dormann[10] there is a continuum of different approaches to the problem: on one end there are theories of preformation, and on the other end systems of self-organization. The concept of preformation assumes that any form is preformed and static; any new form is therefore always the result of a combination of already existing forms. Taking a different approach, self-organization implies a de novo pattern formation that is dynamic and develops over time. Morphogenesis in the self-organization model depends on the interaction between the initial cells or units. Preformation is a top-down idea, while self-organization is a bottom-up system. In both cases, research uses computation as the necessary medium for the simulation of the biological processes. We will argue that, according to the latest research in biology, morphogenesis can be approached as a process which involves both the notion of preformation and that of self-organization.

During morphogenesis, cells proliferate and specialize, i.e. they choose which subset of proteins to express. But how do cells know when to divide or where to specialize? How do cells know where to become skin or bone, or how many fingers to form? It turns out that the key to understanding morphogenesis is the way cells sense and respond to their environment.

Cells obtain information about their environment by using proteins embedded in their membrane to sense specific “message” proteins located around them. When such a “message” protein binds to a membrane protein, the cell “receives” the message and acts accordingly[11]. During morphogenesis there is therefore a constant interaction, through such “message” proteins, between each cell and its neighboring cells. This interaction helps cells understand where in the body they are located, when they should divide and when they need to specialize into a particular type of cell.

The above function, or better, sequence of functions, has been the focus of scientific research for the past two decades. Nowadays, molecular biology can accurately describe many steps of the reactions between proteins and how they are related to cell specialization[12]. It has been shown that these reactions follow specific physical laws which can be described by mathematical models. For example, given a pair of proteins, the properties of the resulting interactions are known. Because of the physical laws being applied, the model of the function of cells is a deterministic one, since it is made of many elementary interactions between proteins that have well defined inputs and outputs. Taking this notion a step further, one could argue that the function of the cells implies the idea of preformation, that is, that from two predefined elements only one possible combination can occur. In a way, the deterministic rules that the reactions of the proteins follow can be seen as a model of preformation, where there is only one output for a given input.

Although the reactions between the proteins inside the cell follow specific, well defined rules, there is still a great degree of unpredictability in the life and function of each cell. Why can’t science predict exactly what the next “moves” of the cells will be, and thus control all the functions in a (human) body? Although the nature of the outcome of the interaction between two proteins has been studied and analyzed, it is not possible to define deterministically when and where this interaction will take place, since proteins move randomly in space and can interact only if they come into proximity and under the proper relative orientation. This is true for proteins inside and outside the cell. Furthermore, it is not possible to define the exact location of neighboring interacting cells, when each cell will sense the presence of a “message” protein, when and how strongly the cell will respond to this signal by secreting its own “message” proteins, and when its neighbors will sense this signal. Given the number and complexity of the functions in each cell, as well as the vast possibilities of interaction with its neighboring cells, the large number of processes that could potentially happen cannot be expressed by deterministic models.

Since there is, to a certain degree, randomness in cellular functions, science has turned to stochastic models in order to explain them. That is, instead of deterministic mathematical models, scientists use models that incorporate probabilities, in order to account for the large number of possible actions. Brownian motion, for example, is the stochastic model that describes the movement of particles in fluids, and it is therefore used to describe the movement of proteins inside the cell. Stochastic processes can describe the spatial and temporal distribution of interactions inside cells and between neighboring cells.
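
A toy sketch (a deliberately crude caricature, not a real molecular model) can make this combination concrete: the reaction rule itself is deterministic, but whether two proteins ever come close enough to react depends on random, Brownian-like steps, so each run of the same system produces a different history.

import random

REACTION_RADIUS = 0.5   # deterministic rule: if two proteins are this close, they bind

def brownian_step(pos, scale=0.1):
    # One random (Brownian-like) displacement in 2D.
    return (pos[0] + random.gauss(0, scale), pos[1] + random.gauss(0, scale))

def distance(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def simulate(steps=1000):
    # Two proteins wander randomly; the binding rule is fixed, the binding moment is not.
    p, q = (0.0, 0.0), (3.0, 3.0)
    for t in range(steps):
        p, q = brownian_step(p), brownian_step(q)
        if distance(p, q) < REACTION_RADIUS:   # deterministic condition...
            return t                           # ...met at a stochastic moment
    return None  # the proteins never met within the simulated time

print(simulate())  # varies from run to run, even though the rule never changes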

To understand the importance of the stochastic component of cell function, consider another example: even though monozygotic twins have exactly the same DNA, they look similar but not identical. If the cell-response system were purely deterministic, then babies with the same DNA should look identical. The small differences in their physical appearance occur because of the stochastic nature of protein motion and interaction during morphogenesis. Even though the initial information was the same, and even though the outcome of protein reactions follows deterministic rules, the exact location of cells and proteins can only be described in a stochastic mode.

The stochastic part of cellular functions could, in a different framework, be seen as a model of self-organization. For people outside the biological research community, the introduction of randomness into research seems particularly intriguing. Instead of a process of preformation (specific aspects of cell function that can be described by deterministic rules), in the self-organizational model cell function results in something that cannot be described by deterministic rules. Cell functions depend on the fact that each cell is part of a whole. Together with their neighboring cells, they react to external stimuli and to a large extent define the characteristics of the whole, as part of a bottom-up process. Deutsch and Dormann focus on the absence of distinction between organizer and organized in self-organized systems: “In self-organized systems there is no dichotomy between the organizer and the organized. Such systems can be characterized by antagonistic competition between interaction and instability”[13]. To a certain extent, the cell functions acquire a self-organizational character, because the transformations depend on the interaction of the cells with each other.


There are many examples like the above in molecular biology, all making the point that both deterministic and stochastic processes are used to describe the phenomena of life. As a general observation, many of the microscopic phenomena in the life sciences follow deterministic rules, while the macroscopic outcomes can be described only in a stochastic way. Following this thought, we argue that models of preformation and self-organization, as described above, can exist simultaneously in a system. The case of cell-cell interaction in general, and of morphogenesis in particular, depicts the complex processes that occur and highlights which parts of those processes suggest a deterministic, preformed model and which parts follow a stochastic model of self-organization.

03. Design

The two cases we have already examined - Xenakis’ work in musical composition and the study of morphogenesis in molecular biology - are both dependent to a great extent on the same medium: computation. Xenakis used the computer as a means to transform mathematical models into music almost from the beginning of his career. At the same time, it would be impossible for researchers today to study the extremely complex phenomena involved in the development of life without the use of the computer.

The use of the computer has, of course, also become one of the main driving forces behind design today. The encounter of computation with design happened rather late and in the beginning took the form of an exploration of the formal possibilities that software packages were offering. That initial - maybe “immature” but still experimental - approach soon gave way to a widely generalized dominance of digital means over every aspect of architecture: from design strategies to the construction industry.

However, using the computer in an architectural context does not necessarily mean that we are taking advantage of the opportunities and the power that computation has to offer. More often than not, the use of computers in architecture today serves the “computerization” of already predefined processes and practices, usually aiming to render them more efficient or less time-consuming. That might be convenient, but it does not promote the invention of new ways to think about architecture. As Kostas Terzidis notes, “computerization is the act of entering, processing or storing information in a computer… [while] … computation is about the exploration of indeterminate, vague, unclear and often ill-defined processes”[14]. While automating and mechanizing everyday architectural tasks may be useful, the true gain for architecture in relation to digital media lies in understanding what a computational design process can really be. Only in this way can we use computers to explore the ‘unknown’, to invent new architectures. We believe that the examples already mentioned from biology and from the music of Xenakis can help us understand where these creative possibilities of computation lie. Computation, of course, has numerous applications in many different fields; the selection of these two specific cases as guidelines, however, has a very specific motivation. The biological approach provides scientific, highly developed techniques that have been tested thoroughly, and at the same time shows how computation and digital tools can become the bridge between complex processes taking place in the physical world and the way that space is created. Xenakis’ work, on the other hand, is an example of computational techniques used outside their strict scientific origins, in order to compose a musical score; it can therefore provide insights into how those methods can be used to create an art form.

The work of Xenakis points out one of the most important aspects that computation brings into (architectural or musical) composition: the introduction of randomness. One can argue, of course, that architects have always had to take decisions based on chance. However, humans are not really capable of creating something totally random. If we ask somebody to draw a random line, then between the decision to draw a line and the action of drawing it there are several layers that affect the result: one’s idea of what a line is, one’s idea of what random is, the interpretation of the phrase “draw a random line”, and so on. Computers, on the contrary, are very good at producing randomness (see figure 02: stochastic algorithm positioning and scaling a box randomly). If we program a computer to draw a random line, the computer will simply draw a line, without anything external interfering between the command and the action. The ability to produce randomness, combined with the ability to perform complex calculations, is what defines the power of computational stochastic processes. And as we have already seen in the work of Xenakis, randomness can be controlled. The architect/programmer can specify rules or define the range within which the stochastic process will take place, and the computer will then execute the process and produce results that satisfy the initial conditions. The advantage of such a process lies in the fact that, through randomness, the architect can be detached from any preconceptions he or she may have about what the result should be, and it therefore becomes easier to generate solutions that were initially unpredicted. By defining the rules and letting the computer generate the results, we open ourselves to a field of almost endless possibilities; designers can produce results that they could not even imagine at the beginning of the process, while still maintaining control of the process and the criteria that should be met.
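
In the spirit of figure 02 (though not the authors’ actual script, which is not reproduced here), a few lines of Python show the idea of controlled randomness: the designer fixes the rules, in this case the allowed ranges, and the machine produces a different but always rule-satisfying configuration of boxes on every run.

import random

# Designer-defined constraints (the "rules"); the exact values are illustrative.
SITE = (0.0, 0.0, 20.0, 20.0)   # x_min, y_min, x_max, y_max of the allowed area
SCALE_RANGE = (1.0, 3.0)        # allowed box sizes
COUNT = 10

def random_box():
    # Position and scale a box randomly, but only within the designer's ranges.
    s = random.uniform(*SCALE_RANGE)
    x = random.uniform(SITE[0], SITE[2] - s)
    y = random.uniform(SITE[1], SITE[3] - s)
    return {"x": round(x, 2), "y": round(y, 2), "size": round(s, 2)}

layout = [random_box() for _ in range(COUNT)]
for box in layout:
    print(box)   # a different layout each run, yet every box respects the constraints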

Figure 02: Stochastic distribution. Object-e architecture: space_sound. 2007.

The most important aspect to realize, and Xenakis’ work makes it easier to see, is that the power of algorithms used in composition lies in the process and not in the final result. If it is a building that we need to produce, then a building may be produced in the end, just as a musical composition was produced in the end by Xenakis. But the emphasis shifts from the final result to the process we use to generate it; the architect no longer designs the building, but the process that generates it. To put it once more: computation defines a process, not an output. And precisely because it allows us to focus on the process rather than the results, those results can be unexpected.


Figure 03: student: Josie Kressner

While Xenakis emphasizes the importance of process over final output, along with the stochastic properties of algorithms, the example from biology, when applied to architecture, highlights another important aspect of the use of algorithms: self-organization. Architectural tradition, starting with the Renaissance, is heavily based upon the idea of the “master”: an architect with a specific vision, which he or she materializes through his or her designs, subsequently creating a style. Self-organization, however, implies a totally different idea: the architect does not actualize through design something that he or she has already conceived. On the contrary, the architect creates the rules, specifies the parameters and runs the algorithm; the output is defined indirectly. Through computation and after many iterations, even the simplest rules can provide extremely complex results, which are usually unpredictable. Moreover, a simple change in the rules may give rise to something totally different. The top-down idea of architecture, with the architect at the top level and his or her creations at the bottom, is inverted: the process begins from the bottom. Simple elements interact with each other locally, and through the iterative application of simple rules complex patterns start to emerge. The architect no longer designs the object, but the system that will generate the final output.

An example of a pattern formation model with self-organizational properties is the cellular automaton, which is extensively used in several different fields and lately also in architecture. A cellular automaton is a self-organized system in which complex formations arise as a result of the interactions and relations between individual elements. The simplicity of the model, combined with its ability to produce very complex results and to simulate a very wide range of different phenomena, makes it a very powerful tool that allows the architect to disengage from the creation of a specific output and to focus instead on the definition of a process. (See figure 04: a one-dimensional cellular automaton and a surface generated by the iterative application of its rules.)
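
As a minimal sketch of the mechanism (an elementary one-dimensional automaton, Wolfram’s Rule 30, chosen here for illustration rather than taken from the student work in the figure), a handful of lines is enough to show simple local rules generating a complex global pattern.

RULE = 30                       # an elementary CA rule, read as an 8-bit lookup table
WIDTH, GENERATIONS = 64, 32

def step(cells):
    # Each cell's next state depends only on itself and its two neighbors (wrapping at the edges).
    out = []
    for i in range(len(cells)):
        left, mid, right = cells[i - 1], cells[i], cells[(i + 1) % len(cells)]
        index = (left << 2) | (mid << 1) | right   # the neighborhood as a 3-bit number
        out.append((RULE >> index) & 1)            # look up the new state in the rule
    return out

row = [0] * WIDTH
row[WIDTH // 2] = 1             # a single live cell as the initial condition
for _ in range(GENERATIONS):
    print("".join("#" if c else "." for c in row))
    row = step(row)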

Figure 04: student: Lauren Matrka

The possibilities arising for architecture are virtually infinite: from the creation of self-organized and self-sustained ecosystems to the study and planning of urban growth. In place of externally imposed “sustainable” rules, we can have internally defined rules that form the generative process. In place of applying external “planning” strategies to cities, we can study urban entities as organisms, as systems that grow following specific rules that define the interaction between their elements.

Yet, as noted in the example of protein interaction, self-organization is not encountered on its own. It always functions together with other, deterministic or preformed, systems, in the same way that Xenakis used indeterministic processes in relation to deterministic ones.

What stochastic processes and self-organization offer to architecture are the means to engage the unknown, the unexpected; the means to move away from preconceptions that define what architecture is or should be, towards processes that explore what architectures can be. As Marcos Novak writes, “by placing an unexpected modifier x next to an entity y that is presumed – but perhaps only presumed – to be known, a creative instability is produced, asking, ‘how y can be x?’”[15]. In that sense, by placing models of self-organization next to models of preformation, or stochastic processes next to deterministic processes, we are not only inventing new systems or new architectures, but also discovering new qualities of the systems that we already know, or thought we knew.

[1] see Xenakis, I. Formalized Music: Thought and Mathematics in Composition New York: Pendragon Press, 1992.

[2] see Varga, B.A. Conversations with Iannis Xenakis London: Faber and Faber limited, 1996, p.76.

[3] In the “deterministic” category of Xenakis’ work fall compositions like Akrata, Nomos Alpha and Nomos Gamma, while examples of the “indeterministic” approach can be found in compositions like N’Shima (Brownian motion) and Analogiques (Markov chains).

[4] Brownian motion in mathematics (also Wiener process) is a continuous-time stochastic process. In physics it is used in order to describe the random movement of particles suspended in a liquid or gas.

[5] Figures 1-2: from the project space_sound, Object-e architecture, 2007. Figures 3-4: student work, School of Architecture, Washington University in St. Louis.

[6] Game theory is a branch of applied mathematics that studies the behavior in strategic situations, where an individual's success in making choices depends on the choices of others.

[7] For a detailed description of Duel see Xenakis, I. Formalized Music: Thought and Mathematics in Composition. New York: Pendragon Press, 1992, pp. 113–122.

[8] Xenakis, I. Letter to Witold Rowicki. see Matossian, N. Xenakis New York: Taplinger Publishing Co. 1986, pp. 164-165.

[9] http://en.wikipedia.org/wiki/Morphogenesis, 03/22/08

[10] see Deutsch, A. & Dormann, S. Cellular Automaton Modeling of Biological Pattern Formation. Boston: Birkhauser, 2005.

[11] see Sadava, D., et al. Life: The Science of Biology. Freeman Company Publishers, 2006.

[12] see Lodish H., et al, Molecular cell biology, Freeman Company Publishers, 2007.

[13] see Deutsch, A. & Dormann, S. Cellular Automaton Modeling of Biological Pattern Formation. Boston: Birkhauser, 2005.

[14] see Terzidis, K. Expressive Form, a Conceptual Approach to Computational Design New York: Spon Press, 2003, p.67.

[15] see Novak, M. “Speciation, Transvergence, Allogenesis: Notes on the Production of the Alien” AD vol 72 No 3 Reflexive Architecture, Spiller N. (ed.) London: Wiley Academy, 2002, p.65.


.10 — Organization and intelligence

Emergent Urbanism
http://emergenturbanism.com/2009/06/01/organization-and-intelligence/
by M Helie on June 1, 2009

1. Sun Tzu said: The control of a large force is the same principle as the control of a few men: it is merely a question of dividing up their numbers.

2. Fighting with a large army under your command is nowise different from fighting with a small one: it is merely a question of instituting signs and signals. - From The Art of War by Sun Tzu

The problem of social cooperation is how to order many individuals into large-scale patterns, and thus acquire the benefits of these larger patterns. The military arts were the first to face this problem, war being a field where inferiority carries severe consequences, and lessons are learned quickly. The solution was known in the time of Sun Tzu: the superior army was the one that could act as a single force, applying a single decision multiplied by however many men were at the command of this army. More men were always better, but past a certain scale it became unmanageable for a commander to yell out orders to everyone and maintain command. In order to resolve this the military men invented hierarchy, a command structure through which the commander’s orders would be distributed so that a group of any size could act as a single force.

For most of history, success in war came from achieving and maintaining organization: lines of command from a center to the individuals that compose an army, such that the commander could deploy the army in the most effective pattern he could think of. Discipline and complete obedience to orders were required, even if the situation as it appeared to the lowly grunt was in total contradiction to the orders he had been signaled. As far as he knew, the commander had a larger picture of the war and the orders ought to work out correctly. But the flaw in organization is that as an organization becomes larger, as the layers of hierarchy increase, the commander becomes more remote and more isolated from his army. The lines of communication become inefficient, the orders become irrelevant, and many men die stupidly.

Nevertheless, for centuries the sheer overwhelming force of numbers more than made up for the losses due to bad orders. The principle of organization triumphed. Reformers started looking for plans to organize industries, entire nations (the command economy of the Soviet Union), and of course, cities. The C.I.A.M. Athens Conference resulted in the publication in 1943, by Le Corbusier, of the Athens Charter, the document upon which the plans to organize modern cities, and to be rid of the spontaneous historic city, were founded.

Between the time of the Athens Conference and the publication of the Athens Charter, the military concept of large-scale organization was completely discredited.

In May 1940 the German army invaded France. The two armies were evenly matched in men and weapons, France even having an advantage in tanks. Within one month the French army's organization collapsed and millions of men surrendered without having put up much of a fight, resulting in many decades of American jokes about French surrender. In reality the two armies were far from evenly matched; the German generals had discovered a means to overcome the weakness in the principle of organization, its reliance on a central, single commander. Their model of cooperation has been called Blitzkrieg, the lightning war, and its intent was to reduce the delay in receiving and sending the "signs and signals" of command by removing them. German commanders out in the field were given broad directives and trusted to figure out on their own how to fulfill them, with glory and medals as reward for success. The French had instead refined organization and bureaucracy into a precise art. Within days of breaking through into France, autonomous German tank divisions destroyed the lines of communication of the French army and paralyzed the front-line units. It became impossible for the army to act as a single force, never mind stop an invasion.

The German system of directive command was in fact the universal principle of emergence applied to military action. Instead of building a hierarchy of orders to communicate the will of a central commander, the armies were organized in parallel, directed to respond to their observed context, a context which was itself produced by other units of the same army. Instead of deploying the intelligence of a single commander holed up in an office in Berlin, the German system linked the intelligence of all of its officers into a more effective super-intelligence that could see all of the battlefield simultaneously. The collapse of the French army was therefore inevitable. It was a case of one against many.

As already mentioned, war teaches quickly, and the Allies eventually adopted a similar operations model to fight the war to victory. German operations theorists went on to design the structure of NATO's European defense, for a war that we fortunately never witnessed. Urban planners never had to learn this lesson, and they opted to organize cities to ruin.

The network structure is often, incorrectly, called a "bottom-up" organization. My opinion is that this label makes no sense. There is no up or down in a network; there is neither bottom nor top. Those are descriptions that apply only to hierarchies. In a network, actions happen horizontally, in parallel. Large-scale patterns are made up of links between those local actions, as seen in the figure above. Human intelligence, for example, cannot be explained as a collection of cells. It is the patterns formed by the links between these cells that are intelligent, and it is these patterns that allow us humans to be several orders of magnitude more complex than individual cells.

The paralysis inflicted on the French army's organization was in part self-inflicted. Longer chains of command involved delays in transmitting information (reports from the field), analyzing the information, planning a reaction and ordering the new deployment. The bigger the army became, the more paralysis it suffered. This organization was in much the same situation as the dinosaur that did not feel a hit on its tail because the nerves were too far from its brain. The bigger it became, the more exposed it was to a paralysis-focused attack.

It should not come as a surprise that what caused the death of cities was also self-inflicted paralysis. But the case of cities is much more tragic. The German operations model was novel and innovative, a radical improvement in the military art. Cities, however, had always been emergent. They were the product of a spontaneous order, a phenomenon that was barely understood at the height of rationalist planning. What science did understand was organization. Since organization was accepted as the pinnacle of science, no rational thinker could reject the new urban planning. The planners did not notice the hints: what they were organizing had never been the creation of anyone.


In a complex emergent system, the number of unique patterns scales up with the size of the system. (What some emergence commentators call “more is different,” another expression that makes no sense.) While an organization attempts to create a large-scale pattern to outmatch smaller patterns, a complex system is made up of both small and large patterns, in proportion to a power law, either nested together or juxtaposed randomly (a fractal). If an emergent system is intelligent, it will structure itself into patterns that no one had expected.

For centuries people had been accustomed to such patterns as the street of similar shopkeepers. Many streets in European cities bear the name of a particular trade, such as baker’s street or threadneedle street. But when cities passed a critical scale during the industrial revolution, a whole new pattern emerged: the central business district. An entire city within the city became the center of commerce, not simply specific streets next to residences. Although it appeared unexpectedly during the 19th century (the Haussmannian renovation of the Opera district of Paris was meant to create a neighborhood for the upper classes, but it became a business center immediately and has remained so ever since), a central business district came to be what a major city was all about. When planners set out to organize a modern city, they planned it around the CBD as the central feature. They did this by drawing a square on the map and applying a different set of rules to this square. Within a few years, their CBDs began dying. The small scale patterns nested within them had been zoned out.

In retrospect it was inevitable for an attempt at organization to severely interfere with urban processes, the principle of organization being a step down in complexity from the principle of emergence. Organization had a sinister advantage: it gave the planners the illusion that they could predict what the city was going to become. An emergent system cannot be predicted with precision. The very basis of its intelligence is that it has not yet been decided what it is going to do. Embracing an emergent system means accepting that patterns will appear that are beyond our comprehension. (In Wolfram’s terminology, the system is computationally equivalent to our own intelligence.)

By trusting their front line officers to run the war for themselves, the German general staff took a leap of faith that paid off decisively and confronted every opposing military with their crippling inferiority. I suspect the first modern city to give up on the principle of organization will trigger a similar revolution.

Sun Jung Hwang