Art Strike is Over

After the election of the Tyrant, a strange thing happened to me. I found my drive to make art and to write about culture evaporate. This didn't happen all at once and, for a time, I kept up a certain level of production, but I noticed my disdain for these activities rising. Somehow, producing for a failed culture seemed wrong. Instead, as I've chronicled here, I turned inward, taking on a variety of projects at our house, but also turning to radical gardening with native plants as a process of healing a small piece of land much damaged by human disturbance (notably the construction of this property forty years ago) together with the violence of contemporary landscaping. Practicing radical gardening has been incredibly valuable, but it's also a matter of temporal displacement: the work I have done won't be visible for years to come. It's an investment in a better future, optimism that this nightmare would end.

As for art, I continued research as a kind of secret activity, but I've definitely felt the need to retreat. At some point, I decided that I'd consciously go on an unannounced art strike while the Tyrant was in office. I couldn't find a way to return to work that has been, at its basis, a sustained inquiry into how we behave toward each other in the public realm. I've talked to a few friends about this and apparently this isn't uncommon at all.

Even if I don't expect the political polarization crippling the US and the world to stop, with the Tyrant in his twilight, playing golf as performance art and flailing aimlessly about, there's no question that a certain weight is gone. The world is still damaged, but it's easier to sleep at night again; there's room to breathe again. If there's another lockdown—and there should be—I'll be here, taking notes and experimenting.

Writing and research are both in the works and you’ll be seeing evidence soon.

Architecture and Rage

This month of rage has been building for a while: the last few weeks, the last three years, are nothing new, neither for the country nor for myself. A tyrant sits on his throne, spreading his bigotry and inciting hatred. Black men and women are dying in the streets, killed not so much by individual crooked cops as by a malignant system central to the disciplinary society that underlies our government.

Nothing new here. That's not a way of ducking the issue; that is the issue. What follows is a highly personal account that I set out to write to underscore just how deeply fucked up the discipline of architecture is.

Over thirty years ago, on August 6, 1988, I witnessed the peaceful Tompkins Square protest against neighborhood gentrification turn into a police riot after cops charged the crowd wielding batons. Police claimed that bricks and bottles were thrown, but I was there; they lied. Cops lie; it's a fact that my father—a conservative and a fan of Ronald Reagan—taught me years before that. The cops charged us. We ran. As we inched our way back to the scene, I watched a young black man in dreadlocks being held by two cops on horses as they hit him over and over with their batons. I was fairly far away and I gauged the scene. I yelled at them and gestured obscenely, hoping they would chase me and give the man some relief. I figured I'd duck into a storefront. Had they come after me, I am sure it would have been bloody, but they didn't. They continued beating him. Later, the police would be charged with over a hundred counts of police brutality. Only two officers were actually found guilty of any wrongdoing and only one—conveniently for the NYPD, a woman—was dismissed.

I turned 21 that year and I was in New York because I was supposed to go to Columbia for graduate school in architecture, but after three days I'd had enough and I quit, going back to Cornell to do a doctorate in the history of architecture. But these two moments aren't random bits of biography; the first led to the second. I'd majored in the history of architecture for two years at Cornell and, as part of that, I took design studio as well. As was the goal of that curriculum, design studio taught me more than anything else during those two years. It seemed to me that there was something fundamentally wrong with the formalist system of education as it was taught at Cornell, as well as at so many other schools, including Columbia.

I tried to find out what was behind this system and, since the first project we worked on was the Nine Square Grid, I eventually traced it to two books. The first, curiously missing from Cornell's Fine Arts Library and obtainable only via interlibrary loan, was the 1971 MoMA catalog The Education of an Architect: A Point of View, which chronicled the teaching of John Hejduk at Cooper Union. The second, which every student seemed to hide in their desk like a pornographic magazine, was the 1975 Five Architects, which presented the work of Peter Eisenman, Michael Graves, Charles Gwathmey, John Hejduk, and Richard Meier. These books were to be read furtively since they suggested that reading, not learning through drawing, might be possible and that the system of education then being employed, which presented itself as timeless, might itself have a history.

I found it difficult, almost impossible, to square these books—filled with formal games and an architecture that defied materiality—with the real political revolution of the 1960s. How could they co-exist in the same milieu? But clearly, it was there: in his preface to Five Architects, Arthur Drexler, Director of Architecture and Design at MoMA, stated, "An alternative to political romance is to be an architect, for those who actually have the necessary talent for architecture."

In 1988, bound for architecture school, I witnessed the violence in the streets and, a month later, decided that it was not the time for me to go. I wasn't just interested in understanding the way architecture repressed politics; I needed to understand it. Against all advice, I chose to try to understand the very educational system that I detested.

As I did so, I ran across two curious articles in, of all places, Spy Magazine. Spy, edited by Kurt Andersen, was this wild, satirical magazine that mercilessly targeted the celebrity class, most notably the "short-fingered vulgarian Donald Trump," who remembered that slight well enough that he tweeted about it in 2015 when Charlie Hebdo's offices were attacked. The first was a 1991 article by John Brodie, "Master Philip and the Boys." Brodie examined how Philip Johnson wielded power and influence in architecture and how he gathered "the Kids," a boys' club of formalist architects, around himself (curiously enough, this included three of the five members of the New York Five plus Robert Stern and Frank Gehry). The second was the late Michael Sorkin's 1988 article, "Where was Philip?" (thanks, Trump, you fucking fuck, for dragging your feet on effective measures against the COVID-19 virus and thereby killing Michael Sorkin, a much better man than you or your awful children will ever be). Using primary sources, Sorkin uncovered how the very same Philip Johnson tried to create a fascist party in the United States, going so far as to accompany the Nazis on the blitzkrieg into Poland, singing the invading army's praises in (what we now call fake) news articles for extreme Right-wing publications. Again, one thing led to another: I realized these three phenomena—a formalist architectural education, an aged architect who gathered power around himself, and the largely suppressed Nazi past of that architect—were not distinct, but were all aspects of the same system.

I spent three years writing a dissertation exploring how contemporary architecture was fundamentally based on a process of spectacularization—of stripping politics from history for the purposes of an illiberal politics of form. I was writing in the aftermath of the scandals about Paul de Man's collaborationist writing and I was naïve enough to believe that the academy would take my work to heart and examine itself. Matters had already changed by the time I filed my dissertation, when Franz Schulze published his biography of Johnson. I hadn't known Schulze was writing the biography, or even doing research into Johnson's fascist period, until right before publication, so his work hardly impacted mine, apart from some last-minute revisions. Schulze's book was a bizarre and offensive combination of pinkwashing—Johnson loved the boys in their black boots and hot uniforms so it was all ok—and old-school outing (Johnson was gay! he was gay, imagine the scandal!). Schulze, who demonstrated no interest in critical thought, either didn't have the capacity to make the connection between the postwar whitewashing of Johnson's Nazi past and the concomitant removal of both Left and Right politics from modern architecture or, more likely, chose to ignore it. These were the same things: they both served power, privilege, and a Nietzschean drive to form at all costs. If Schulze didn't draw this connection, I would soon find out that the academy didn't want to hear it.

I published two articles from my dissertation in the Journal of Architectural Education. Editor Diane Ghirardo greatly supported this work, as did the peer reviews I received. The first article was "We Cannot Not Know History," in which I examined Johnson's Nazi past and the attempts to repress it. This immediately got me in hot water with the JAE's parent organization, the Association of Collegiate Schools of Architecture. Whether the directorate was interested in killing the article because—as they claimed—it opened the organization to libel lawsuits or whether they hoped to kill it because the article hit too close to home is up to you to decide. The second article was "The Education of the Innocent Eye." My exploration of the formalist system of architectural education was damning as well—after all, I concluded, "An initial exposure to architecture through the absolute system of the visual language of architecture clearly puts materialist questions second. Beam, column, wall, compression, shear, and rotation take precedence, and history, theory, gender, race, and class take a backseat." This time, quashing the article wasn't in the cards and I won Best Article of the Year from the JAE.

But I'd already found that not only was architecture unwilling to tackle the questions I'd raised, my dissertation topic was, on the contrary, an impediment to a career in the academy. Granted, it was the recession, but I spent two years unable to find employment until Margaret Crawford gave me a chance to teach history and theory of architecture at SCI-Arc. Even then, my work was received with suspicion by virtually all of the design faculty there, many of whom would have gladly curried favor with Johnson if given the opportunity. In the end, upon becoming Director, Eric Moss asked me in an accusatory tone what I was hoping to gain from writing this sort of thing. No great surprise: it was well known that he was a wannabe Kid. He made it clear I was unwelcome in his regime, and his leadership style (so clearly prefiguring Trump's presidency in its vulgarity and absolutism) was such that I didn't want to work for him in any event.

In the meantime, I'd hoped to publish my dissertation in book form. Theory was rapidly falling out of fashion and there weren't many takers for my work on architecture education. What still shocked me was how reluctant publishers were to publish my material on Johnson. Encouraged by some colleagues (notably Mike Davis), I put together a proposal to publish my work on Johnson together with the texts he wrote back in the 1930s. Unfortunately, the publishers I approached had no interest in helping architecture confront itself. One asked, like Moss, "what I hoped to gain from all this." (I started posting these articles in 2017 but the silence sapped my spirit… I'll try again another day and see if there is any greater interest this time.)

I taught as a visiting faculty member at the University of Pennsylvania for a term under the late, brilliant, and gentle Detlef Mertins, but work there was only temporary. Clearly it wouldn't have been possible for me to stay in the history of architecture—my work was too dangerous for a powerful figure in that program, friends confided in me (I suspected having a non-Anglo-Saxon name was also a problem for this fellow… as it has been in so many places throughout my life)—and that was the last time I was a full-time member of a history of architecture faculty in the US. I reinvented myself: if my dissertation had been about social networks, I now looked to theorize the impact of digital networks on cities and forged a career in that field. I managed to land a position in the laboratory research wing of the architecture school at Columbia that lasted well over a decade until it was eliminated by a new Dean seeking to cut costs. Even so, it was clear at Columbia that my presence as a historian was deeply questionable, not to be publicly acknowledged, and I was never asked to be part of that faculty. Architecture still can't stomach critique; this is clear to me.

I’m out of teaching, likely for good. I knew that this work was too dangerous for architecture and I had a backup plan for funding that paid off at just about the same time that the labs were shut down at Columbia so now I can work on other projects that interest me more. I’m much happier without the phony leftism of the university, of faculty who pretend to be Marxists, but whose real goal is defending their own turf and the system.

I have a lot of (political) gardening to do, some art to make, and a translation of one of my texts to look over, but along the way, I intend to go back to some of these projects and post material from them on this site. Let's see if there's any interest this time. I'm not confident about it.

Taking a Break

I am taking a break from email and social media for a few days, but more likely a week. With all of the death around us and the moron in the White House, it's just too much. If I owe you anything, sorry; I need to protect my mental health more.

The 2010s in Review (The Decade of Shit)

So this post is late, what do you expect from the 2010s? Ten years ago Time Magazine dubbed the 2000s the “worst decade ever,” but in retrospect that was such a carefree time, wasn’t it? Even the end-of-the-decade collapse seemed full of possibility, promising a truly cataclysmic civilizational implosion if nothing else.

In contrast, the 2010s weren't just another failed decade, they were the decade of shit. All the hype and excitement led us to a universal dissatisfaction. Left, Right, and Center, we're all pissed off about where we are and not enthusiastic at all about where we're going. Even the proponents of doom are disappointed. The whole ZeroHedge crowd has been left trying to cover their short positions as the economy lurches onward and the doomers are facing expiration deadlines on their MREs as they wait for TEOTWAWKI. Now sure, this year we've already had firestorms the size of Austria ravaging Australia, a rain of rockets in Baghdad, a Ukrainian jetliner shot out of the sky, a deadly pandemic in China caused by people eating creatures that they really shouldn't, and the failure of the Senate to uphold the rule of law, but the banality of it all is crushing. While the Dark Mountain set drinks wine around a campfire, gets henna tattoos, and sings along to songs about the end of nature, for the rest of us, it's just an exhausting, daily slog through the unrelentingly alarming headlines.

We finish the decade with network culture in its last days. Back in 2010, when I was working in earnest on a book on network culture,* I made the following prediction:

“Toward the end of the decade, there will be signs of the end of network culture. It’ll have had a good run of 30 years: the length of one generation. It’s at that stage that everything solid will melt into air again, but just how, I have no idea.”

Well, now we know how. All the giddy delirium about the network and globalization is gone. We've got our always-on links to the net in our hands all the time, we've got our digital economy, and with it all we have entered a period of stark cultural decline. It's an empty time, devoid of cultural monuments. Name one consequential building of this past decade, one!

Juoda Paduška (Black Pillow), Valdas Ozarinskas

Well, there is this great project by architect Valdas Ozarinskas (who didn't make it through the decade), a massive dark summation of our failures. But culture's not doing well. The easy appeal to dopamine receptors provided by Facebook, Twitter, Instagram, Netflix, and YouTube has undone our ability to focus. This is the golden age of cat videos, nothing more. Any sustained thought is gone and bottom-up efforts have dissipated.

A decade ago, my blogging comrades and I were plotting how to take over architectural discourse. But this amounted to nothing. Those blogs, and many others, have been silenced, absorbed into garbage sites like Medium, Facebook, Forbes, and BusinessInsider or just left in whatever state they were in, circa 2014, as their creators went on to Twitter, Facebook, Instagram, or just premature (or belated?) fin-de-siècle ennui. Podcasts are the one flourishing outpost of real DIY content on the Internet, perhaps because they distract from the world around us, but in all other respects the DIY ethic is on the wane. The most interesting publication of the 2000s, Make Magazine, went under in 2019, at least temporarily. Kickstarter is the metaphor for the decade: a lot of promises, a lot of crap that we’ve thrown away, a lot of outright lies, and a feeling of dread that somehow we’ll be sucked into its maw and participate again.

But it's not just banality. As Bruce Sterling points out in his State of the World 2020 at the Well, we were always too optimistic about what bottom-up efforts could do. Network culture gave birth to the vilest of viral propaganda, some of it from state actors, some of it genuinely homegrown. In Bruce's words, "Our efforts had evolved an ecosystem for distribution of weaponized memes."

Network culture didn't usher in a new world of Free! Open Access! Networked Culture! Rather, it ushered in the first phase of the Jackpot. That's a phrase I use often these days, lifted from William Gibson's 2014 The Peripheral, one of a handful of genuinely insightful cultural artifacts from the last decade (note that William Gibson's Twitter name is @GreatDismal). The Jackpot refers—not so cheerily—to the end for some 80% of the world's population, rich and poor, developed and undeveloped (largely the former in each pair).

Time for a lengthy quote from Gibson:

No comets crashing, nothing you could really call a nuclear war. Just everything else, tangled in the changing climate: droughts, water shortages, crop failures, honeybees gone like they almost were now, collapse of other keystone species, every last alpha predator gone, antibiotics doing even less than they already did, diseases that were never quite the one big pandemic but big enough to be historic events in themselves. And all of it around people: how people were, how many of them there were, how they’d changed things just by being there. …

But science … had been the wild card, the twist. With everything stumbling deeper into a ditch of shit, history itself become a slaughterhouse, science had started popping. Not all at once, no one big heroic thing, but there were cleaner, cheaper energy sources, more effective ways to get carbon out of the air, new drugs that did what antibiotics had done before…. Ways to print food that required much less in the way of actual food to begin with. So everything, however deeply fucked in general, was lit increasingly by the new, by things that made people blink and sit up, but then the rest of it would just go on, deeper into the ditch. A progress accompanied by constant violence, he said, by sufferings unimaginable. …

None of that … had necessarily been as bad for very rich people. The richest had gotten richer, there being fewer to own whatever there was. Constant crisis had provided constant opportunity. … At the deepest point of everything going to shit, population radically reduced, the survivors saw less carbon being dumped into the system, with what was still being produced being eaten by those towers they'd built… And seeing that, for them, the survivors, was like seeing the bullet dodged.

Now, amidst modernization, two World Wars, massive pollution, and unprecedented environmental cataclysms such as Minamata Bay, Chernobyl, or Bhopal, the twentieth century was hardly a cakewalk, and when it comes down to it, the salinization of the Fertile Crescent is likely our real original sin (won't someone start a green Church around this as the fall from grace?). But the Jackpot officially began on 9 November 2016, a day that reminded many of us of the outrage followed by the blank numbness that we experienced on 9/11. A new emergency was declared by never-Trump neoliberals, anti-Trump leftists, and, outraged by all the outrage, the pro-Trump neoreactionaries and neo-Nazis who had stocked up on guns in case HRC was elected and then had nothing to do. There would be no turning back now.

The horrifying truth underlying the Jackpot is that it isn't just an accident. We used to think that global inequality was based on structural poverty, but now it's becoming clear that automation will ensure that vast numbers of people are no longer needed. Couple that with climate change and you have the Jackpot. Entire swathes of the world—Afghanistan, Iraq, Syria, Libya, South Sudan, and Yemen—have already been rendered nearly uninhabitable by drought and continual war waged by the United States, Iran, Russia, and their proxies for strategic purposes. Trump's America is full of individuals who have no future whatsoever and, with self-driving vehicles on the way, millions of truck and taxi drivers are about to find themselves as in demand as coal miners in West Virginia. Moreover, even as we continue to belch out carbon at an unprecedented rate, it's only a matter of time before jobs in the fossil fuel and traditional automotive industries disappear as well. New jobs will appear, of course, but there will be far fewer of those, and the idea of teaching coding to coal miners proved not to be that sound.

The newly disenfranchised have little to lose: the cracks are showing. It's not that the containment can only last so long; it's already breaking everywhere. We can see the collapse of the Paris Accords not just as a deliberate step further into dark acceleration but as a lizard-like reaction to the Jackpot, part of a new strategy: national government as survivalist retreat.

Sure, the Trump administration is largely composed of knuckle-dragging defectives, but we can discern a strategy if we look carefully enough: rather than making sacrifices to survive the Jackpot, they will do what they can to get the biggest piece of the pie for themselves and their cronies in the colossal redistribution of wealth it represents. And don't get your hopes up about a socialist revolution in the next round of elections. My academic leftie friends impute way too much to old Frankfurt School notions of ideology: the reason Trump and his cohort of populists have been elected isn't because the poor have been duped and only need to see the way; it's because they know there's no hope for them. There's no standing in solidarity behind a neo-socialist boomer; they all know that's not going to work and if they're going out, they're going to drag down the elites with them. Americans elected this grinning, Adderall-abusing droog and will most likely do so again, just like the rest of the populists rising to power worldwide. In Joseph de Maistre's (and Julie Mason's) words, "every nation gets the government it deserves." Who better than the "King of Debt" to show us that the Jackpot is here?

If anything, not only has the Left failed to come up with a convincing counter-argument, it's gone down the entirely wrong route with identity politics, which has all but taken over not just Left politics but also the academy and museums. Identity politics isn't anti-Trump, it's high Trump, embracing the idea that the Jackpot has started and the next step is to redivide the pot in favor of your tribe. In the art world and the academy, it teams up with an exhausted neoliberalism looking for an alibi while it also helps sell culture to a new generation of oligarchs even as it further exacerbates the rampant tribalism in our society. Steve Bannon understood it well: if the Left fights on the basis of identity politics, his Right-wing identity politics wins every time. But for many on the woke Left, this is hardly a problem: a Trumpian government gives their screaming more legitimacy and feeds their fevered dreams of revolution.

Against the rise of identity politics on the Left and Right, the Center is left floundering. Neoliberalism is exhausted, its most appropriate cultural manifestation being overtourism. As the decade started, I repeatedly tried to launch a major research project on the phenomenon in the academy, but the project fell on deaf ears. I wasn't surprised. Universities are just like travel: there is an appearance of diversity and difference, but it's a generalized sameness, a gray nothing in which you won't ever encounter anything new, just another Starbucks serving poor-quality beans over-roasted so that you can't tell what they are and some screaming about how special the place is.

My sense is that if there’s to be any kind of hope in the next decade to get out of what Bruce Sterling appropriately calls “the New Dark,” it’s going to be to achieve the impossible: throw out identity politics (left and right) and turn back toward a grand project—what academics used to criticize as a “metanarrative”—that most of us can get behind.

As this decade showed, there's little question that this is climate change and toxins in the environment. Here's where the Jackpot has its upside: the deaths of billions of humans pale in comparison to the species-cide we are undertaking, wiping out species left and right. Have you listened for the dawn chorus of birds lately? A hundred years from now the biggest news of this decade may be that this is when Rachel Carson's Silent Spring became real as birds and pollinators died off in massive numbers. Here on the Northeast seaboard, we've bid goodbye to the ash tree and are watching for beech leaf disease, white pine needle disease, sudden oak death, and hoping the spotted lanternfly doesn't cross the Delaware. It seems like the only place nature really thrives anymore isn't in national parks, it's in radiation exclusion zones.

But dead birds and trees don't matter much to the average person who doesn't have anywhere to shop besides the Dollar General Store and hasn't seen fresh vegetables in years. The key is probably going to be luck, bad luck (and what is the Jackpot about if it's not luck?). Maybe, just maybe, if a series of truly awful major environmental cataclysms hit the key countries involved in carbon production—the US, China, India, Canada—they might be alarmed enough to do something about it. We aren't talking about a category 5 hurricane hitting New York City; nobody will care about that. We are talking about flattening a good portion of Florida and the Southeast, plus a good bit of Texas, maybe a Dust Bowl 2.0 coupled with massive flooding in the Midwest, and then doing that five or six times over worldwide in the space of a couple of years. That's a horrible, terrible thing, but if luck isn't with us and it doesn't happen, what are our chances of avoiding much worse conditions? And even if it does, will we be too late to turn back the clock?

*Never finished because a publisher botched the project to the point I quit working on it in disgust. But you can read an early essay (circa 2006) on network culture here.

My Dear Berlin Wall

This weekend marks the thirtieth anniversary of the fall of the Berlin Wall and, to commemorate, I thought it would be appropriate to post this chapter from Blue Monday, which Robert Sumrell and I published a decade ago as AUDC.

My Dear Berlin Wall

On June 17, 1979, Eija-Riitta Eklöf, a Swedish woman, married the Berlin Wall at Groß-Ziethener Straße, taking Wall Winther Berliner-Mauer as her name. The final piece of heroic modernist architecture, the Berlin Wall was constructed just as the movement's Utopian political ambitions had begun to wane. By then, with the eastward spread of modernism during the Khrushchev years, the ideological distinctiveness of modernity had come to an end, making East and West, modern and antimodern, harder to distinguish. Still, the architects of the Berlin Wall hoped it would change society.

For a while, it did just that. After the close of World War II, Berlin was a microcosm of Germany. Both city and country were cut into four occupation zones, each overseen by a commander-in-chief from one of the four Allied powers, the United States, the United Kingdom, France, and the Union of Soviet Socialist Republics. As the geopolitical tide continued to shift in the years after the war, tensions escalated between the Soviets and the Western allies. By the time of the Berlin Blockade of 1948-1949, it became clear that the three allied segments of Berlin would become “West Berlin,” an enclave of the Federal Republic of Germany (or “West Germany”) while the Soviet-controlled sector would become “East Berlin,” associated with the German Democratic Republic (or “East Germany”). Responding to the blockade crisis, Belgium, Canada, Denmark, France, Iceland, Italy, Luxembourg, Netherlands, Norway, Portugal, United Kingdom, and the United States promised mutual military defense through the North Atlantic Treaty Organization (NATO) in 1949. In 1955, West Germany joined NATO and, in response, the Soviet Union formed the Warsaw Pact with Albania, Bulgaria, Czechoslovakia, Hungary, Poland, and Romania. East Germany joined one year later as two events cemented the division of the world. The first was the Suez Crisis in which the United States, fearing thermonuclear war with the Soviet Union, forced France and Britain to withdraw from the Suez Canal, a critical piece of infrastructure they had occupied earlier that year to prevent Egypt from nationalizing it. The second was the Hungarian Revolution, crushed by Soviet tanks as the West watched. With the refusal of the Soviet Union and the United States to enter into direct confrontation, it became clear that the world, Europe, Germany, and Berlin had been divided into spheres of influence controlled by the superpowers.

The superpowers fought the Cold War through their display of achievements in science, propaganda, and accumulation. Berlin became the prime place for the two competing rivals to showcase their material culture. In the East, the Stalinallee, designed by architects Hartmann, Henselmann, Hopp, Leucht, Paulick and Souradny, was a nearly 2km long, 89m wide boulevard, lined with eight-story Socialist Classicist buildings. Inside these vast structures, workers enjoyed luxurious apartments, shops, and restaurants. In response, West Berlin held the Interbau exhibit in 1957, assembling the masters of modern architecture including Alvar Aalto, Walter Gropius, Le Corbusier, Oscar Niemeyer, Johannes van den Broek and Jaap Bakema, Egon Eiermann, and Pierre Vago to build a model modernist community of housing blocks in a park in the Hansaviertel quarter.

Nor was the battle of lifestyle between East and West limited to architecture. Soviet communism followed capitalism in focusing on industrial productivity and expansion within a newly global market. While the United States exported its products—and eventually outsourced its production—throughout the world, the Soviets used COMECON, the economic equivalent of the Warsaw Pact, to eliminate trade barriers among communist countries. Each country produced objects to represent its superior quality of life, validating each system not only to its own citizens, but also to each other and to the rest of the world. The importance of material culture in the Cold War is underscored by the 1959 Nixon-Khrushchev "Kitchen Debate," played out between the two powers in a demonstration kitchen at a model house during the American National Exhibit in Moscow. At the impromptu debate, Soviet Premier Nikita Khrushchev expressed his disgust at the heavily automated kitchen and asked if there was a machine that "puts food into the mouth and pushes it down." U.S. Vice President Richard Nixon responded, "Would it not be better to compete in the relative merits of washing machines than in the strength of rockets?" A Cold War battle was fought through housewares.(1)

The climax of the battle came in Berlin. In the divided city, half a million people would go back and forth each day between East and West. Westerners would shop in East Berlin where products, subsidized by the East Bloc, were cheap. Easterners would shop in the Western sector where fetish items such as seamless nylons and tropical fruits could be found. Overall, however, the flow was westward. The rate of defection was unstoppable: between 1949 and 1961 some 2.5 million East Germans left for the West, primarily through Berlin. But just as destructive was the mass flight of subsidized objects westward, a migration that pushed the inefficient communist system to collapse. Khrushchev had won the debate: cheap goods from the East were more desirable, but the cost to the system was unacceptable. By this time, production and accumulation in the East Bloc had become goals in and of themselves, devoid of logic and use, no longer tied to the basic needs of the economy. In 1961, Khrushchev launched the Third Economic Program to increase the Soviet Union's production of industrial goods at all costs. During a meeting that year, COMECON decided that the flow of people and products had to be stopped. On August 13, 1961, the border was sealed with a barrier constructed by East German troops.

The result was a city divided in two without regard for prior form or use. Over time, the Berlin Wall evolved from barbed wire fences (1961-1965) to a concrete wall (1965-1975), until it reached its full maturity in 1975. The final form would be not one but two Walls, each constructed from 45,000 sections of reinforced concrete, 3.6 meters high by 1.5 meters wide and weighing 2.75 tons apiece, separated by a no-man's-land as wide as 91 meters. The Wall was capped by a smooth pipe, making it difficult to scale, and was accompanied by fences, trenches, and barbed wire as well as over 300 watchtowers and thirty bunkers. (2)

With its Utopian aspiration to change society, the Wall was the last product of heroic modernism. It succeeded in changing society, but as with most modernist products, not in the way its builders intended. East Berlin, open to the larger body of the East Bloc, withered, becoming little more than a vacation wonderland for the Politburo élite of the Soviet Union and Warsaw Pact and a playground for spies of both camps. Cut off, West Berlin thrived. By providing a datum line for Berlin, the Wall gave meaning to the lives of its inhabitants. In recognition of this, Joseph Beuys proposed that the Wall should be made taller by 5cm for “aesthetic purposes.” (3)

As the Wall was being constructed in Berlin, Situationists in Paris and elsewhere were advocating for radical changes in cities as a means of preserving urban life. For them, the aesthetics of modernism and the forces of modernity were destroying urbanity itself. During this period working-class Paris was being emptied out, its inhabitants sent by the government to an artificial modernist city in the suburbs while an equally artificial cultural capital for tourists and industry was created in the cleaned-up center. Led by Guy Debord, the Situationists hoped to recapture the city by creating varied ambiances and environments and strategies providing opportunities for stimulation and chance drift. When Situationist architect Constant Nieuwenhuys deployed the floating transparent layers of his New Babylon to augment the existing city with unique, flexible and transient spaces, Debord condemned the project, arguing that the existing city was already almost perfect. Only minor modifications were necessary, such as adding light switches to street lights so that they could be turned on and off at will and allowing people to wander in subways after they were shut off at night.

In his 1972 thesis at the Architectural Association, entitled “Exodus, or the Voluntary Prisoners of Architecture,” Rem Koolhaas found a way of reconciling modernism with Situationism through the figure of the Wall. Suggesting that the Wall might be exported to London and made to encircle it, Koolhaas writes, “The inhabitants of this architecture, those strong enough to love it, would become its Voluntary Prisoners, ecstatic in the freedom of their architectural confines.” Inside, life would be “a continuous state of ornamental frenzy and decorative delirium, an overdose of symbols.” Although officially proposing a way of making London more interesting, Koolhaas’s thesis is really a set of observations about the already existing condition of the real Wall. (4)

In choosing to encircle London with the Wall, Koolhaas recognized that it was not only the last great product of modernism, it was the last work of heavy architecture. Already in 1966, in his introduction to 40 Under 40, Robert Stern observed that an increasingly dematerialized “cardboard architecture” was “the order of the day” in the United States while in England, architects such as Archigram were proposing barrier-less technological utopias.(5)

Built of concrete, the wall was solid, weighty. It hearkened back to the days of the medieval city walls, which were not only defensive but attempted to organize and contain a world progressively more interconnected through communications and trade. Walls acted as concentrators, defining places in which early capitalism and urbanity could be found and intensifying both. So long as the modes of communication remained physical and the methods of making and trading goods were slow, nations retained their authority and autonomy through architectural solidity.

The destruction of the Berlin Wall in 1989 is concurrent with the pervasive and irreversible spread of Empire and the end of heavy architecture. Effectively, the Wall fell by accident. During 1989, mass demonstrations in East Germany led to the resignation of East German leader Erich Honecker. Soon, border restrictions between neighboring nations were lifted and the new government decided to allow East Berliners to apply for visas to visit West Germany. On November 9, 1989, East German Minister of Propaganda Günter Schabowski accidentally unleashed the Wall's destruction. Shortly before a televised press conference, the Minister was handed a note outlining new travel regulations between East and West Berlin. Having recently returned from vacation, Schabowski did not have a good grasp on the enormity of the demonstrations in East Berlin or on the policy being outlined in the note. He decided to read the note aloud at the end of his speech, including a section stating that free travel would be allowed across the border. Not knowing how to properly answer questions as to when these new regulations would come into effect, he simply responded, "As far as I know effective immediately, right now." Tens of thousands of people crowded the checkpoints at the Wall and demanded entry, overwhelming border guards. Unwilling to massacre the crowds, the guards yielded, effectively ending the Wall's power. (6)

The Schabowski mis-speak points to the underlying reason for the Wall’s collapse: the lack of information flow in the East Bloc. The goals of Khrushchev’s Third Economic Program were finally met by the early 1980s, but by that point the United States was no longer interested in production. For America, the manufacturing of objects proved to be more lucrative when outsourced to the developing world. By sending its production overseas, America assured the success of its ideology in the global sphere while concentrating on the production of the virtual. Having spent itself on production of objects, as well as on more bluntly applied foreign aid, the Soviet Union collapsed. Manuel Castells observes that, “in the 1980s the Soviet Union produced substantially more than the US in a number of heavy industrial sectors: it produced 80 percent more steel, 78 percent more cement, 42 percent more oil, 55 percent more fertilizer, twice as much pig iron, and five times as many tractors.” The problem, he concludes, was that material culture no longer mattered. The PC revolution was completely counter to the Soviet Union’s centralization while photocopy machines were in the hands of the KGB. (7)

With the Wall gone, and with it the division between East and West undone, the world is permeated by flows of information. Physical borders no longer hold back the flow of an immaterial capital. Indeed, soon after the Wall fell, the European Union set about in earnest to do away with borders between individual nation states.

But through it all, the Wall has had no greater fan than Wall Winther Berliner-Mauer. In her view, the Wall allowed peace to be maintained between East and West. As soldiers looked over the Wall to the other side, they saw men just like themselves, with families they loved and wanted to protect. In this way, the Wall created a bond between men who would otherwise be faceless enemies. But Berliner-Mauer’s love for the Wall is far from abstract. For her, objects aren’t inert but rather can possess souls and become individuals to fall in love with. Berliner-Mauer sees the Wall as a noble being and says she is erotically attracted to his horizontal lines and sheer presence. While Berliner-Mauer is in a seemingly extreme, sexual relationship with the object of her desire, our animistic belief in objects is widespread across society. Adults and children alike confide in stuffed animals, yell at traffic signs, and stroke their sports cars all the time. Many people choose to love objects over their friends, their spouses, or themselves. Berliner-Mauer understands that the role of objects in our lives has changed. Objects are no longer tools that stand in to do work for us, or surrogates for normal human activities. Objects have come into their own, and as such we have formed new kinds of relationships with them that do not have a precedent in human interaction. The Wall is a loving presence in her life, and she loves it in return.

Faced with the tragedy of the Wall's destruction, Berliner-Mauer created a technique she calls "Temporal Displacement" to fix her mind permanently in the period during which the Wall existed, even at the cost of creating new memories. (8) For Berliner-Mauer, marrying the Wall is a last means to preserve the clarity of modernism and the effectiveness of architecture as a solid and heavy means of organization. To love the Berlin Wall is to dream of holding onto the simple, clear disciplinary regime of order and punishment that came before we were all complicit within the system. In today's society of control, as Gilles Deleuze writes, we don't need enclosures any more. Instead, we live in a world of endless modulations, internalizing control in order to direct ourselves instead of being directed.(9) After the fall of the Wall and the end of enclosures, even evil itself, Jean Baudrillard observes, loses its identity and spreads evenly through culture. An Other space is gone from the world. North Korea and Cuba are mere leftovers, relegated to tourism. There is nowhere to defect to: without the Wall, we must all face our own complicity in the system.(10)

(1) Elaine Tyler May, Homeward Bound: American Families in the Cold War Era (New York: Basic Books, 1999), 10-12.

(2) Thomas Flemming, The Berlin Wall: Division of a City (Berlin: be.bra Verlag, 2000), 75.

(3) Irving Sandler, Art of the Postmodern Era (Boulder, Colorado: Westview Press, 1998), 110.

(4) Rem Koolhaas, S, M, L, XL (New York: Monacelli, 1995), 2-21.

(5) Robert A.M. Stern, ed., 40 Under 40: An Exhibition of Young Talent in Architecture (New York: The Architectural League of New York, 1966), vii.

(6) Angela Stent, Russia and Germany Reborn: Unification, the Soviet Collapse, and the New Europe (Princeton: Princeton University Press, 1999), 94-96.

(7) Manuel Castells, End of Millennium, Second Edition (Oxford, UK: Blackwell, 2000), 26-27. See also the entire chapter on "The Crisis of Industrial Statism and the Collapse of the Soviet Union," 5-67.

(8) https://www.berlinermauer.se (note from 2019: Eija-Riitta Eklöf-Berliner-Mauer died in 2015 and this site is now some kind of spam site)

(9) Gilles Deleuze, “Postscript on the Societies of Control,” October, Vol. 59 (Winter 1992), 3-7.

(10) Jean Baudrillard, "The Thawing of the East," The Illusion of the End (Stanford: Stanford University Press, 1994), 28-33.

On Networked Publics and the Facebook

At a recent party, an acquaintance asked where I'd disappeared to. Nowhere, I said, and when I inquired into what prompted their question, they responded by saying that they couldn't find me on the Facebook. I was a little surprised about this since I am hardly the first person to deactivate or delete an account; in the United States and the European Union, the Facebook's membership peaked a few years ago and has been declining since, while engagement has dropped steeply.

Still, the Facebook has effectively replaced most social communication for many people, so I initially had some anxiety about deactivating my account. I thought it would be hard to go without the Facebook since it gathered together people from throughout my life and career and also hosted a few key discussion groups that I enjoyed (mainly pertaining to modular synthesis). Two months later, I don’t feel I am missing out on anything; on the contrary, I am more at ease. Losing the constant noise of the Facebook feed is a joy. If anyone actually wants to get in touch, I am here.

The connections we establish on Facebook are superficial. Birthdays are a case in point: it's lovely when people remember your birthday, but they don't really, the Facebook prompts them to. And a couple of happy birthdays doesn't excuse the toxic impact the site has had on politics, which I find much more oppressive than Twitter, which is used by a large number of journalists and bloggers. My most active Twitter circle is composed of some of the best writers in architecture and urbanism today as well as a smattering of people involved with digital culture. So, when I mentioned I was quitting Facebook, William Ball sent a link to a scary article by Timothy McLaughlin about how Facebook's rise in Myanmar fueled genocide, while Frank Pasquale lamented that Facebook might wind up like Big Tobacco, accepting the shrinking of its domestic market since it is doing so well overseas.

This got me thinking. Something has changed in network culture: the 2016 election and the end of the Facebook's growth in the US and EU are indicators that the heady exuberance about the Net has turned to anxiety and dismay. That the methods of communication used by networked publics are fundamentally flawed in a way that earlier publics were not isn't just something that scholars talk about anymore; it's something we all face, daily. Back when we took on the term as an object of study at the Annenberg Center for Communications, we understood "networked publics" to refer to a group of individuals with a particular interest in connecting together digitally. Some researchers—not coincidentally sponsored by technology corporations rather than universities—mistakenly depicted social network sites as networked publics in themselves. This is a category error: with few exceptions (the first incarnation of aSmallWorld, intended as a gathering place for the mega-rich and their hangers-on, comes to mind), social networking sites act not as publics in themselves but rather as hosts or platforms on which, in theory, individuals could participate in a variety of different publics.

But they don't. Algorithms ensure posts only reach individuals who "like" similar content, inhibiting the discussion and deliberation necessary to create a public, supplanting it with a brown slime of happy (or angry, or sad, depending on the sorts of things one "likes") posts. Social networking sites profit off of various malignant actors who promote content covertly. Controversy is good, creating more engagement, regardless of the human cost.

If publics can’t form on social networking sites and if, for many people, social networking sites are the primary means of interacting with others, is it possible that publics and public discourse are becoming extinct as a consequence? When I see people saying “it’s time for us to have a conversation” when they really want to shame you into accepting whatever idea they have adopted as their cause that day, I wonder if we have become completely unable to engage in actual public discourse. After all, the modern category of the public is less than four centuries old. Why should we be so bold as to think it will endure?


Drones at the New School

I am delighted to announce that I will be presenting Perkūnas at the New School tomorrow during the Sonic Pharmakon conference put together by Ed Keller for the Center for Transformational Media (I am a fellow with CTM this year). See the schedule here. As a modular synthesist with a sound sensitivity disorder, I am committed to the possibilities of drone and would love to see Sunday's events (alas, I am on a plane back from Kauai today so will miss the day).

Aleksandra Kasuba, 1926-2019

Artist and designer Aleksandra Kasuba passed away on March 5. Kasuba was a brilliant force and I knew her throughout my life. Among my earliest memories is crawling around in her Lived-In Environment, a radical transformation, through tensile constructions, of the first floor of the Upper West Side brownstone that she and her husband owned. My parents were friends with her and her husband, the sculptor Vytautas Kasuba, and we would see each other periodically.


Although I lost touch with her after I went to graduate school and then moved to Los Angeles, we reconnected on the occasion of my giving a lecture on her work at the National Gallery of Art in Lithuania to accompany the exhibition Spectrum: An Afterthought. During those 25 years she had kept tabs on me and agreed that I could give the lecture—apparently other historians had failed to understand her work properly—and I took copious notes on our discussions. Last week I wrote her obituary for the Architect's Newspaper. See it here. I am revising my talk for the catalog of the retrospective of her work to take place there in 2020.

Year in Review 2018


I let six years go by without a Year in Review post before restarting the tradition last year. Not this time, although, with the frenetic pace of news this year, it seems like we have all aged six years in 2018.

Things are in a profound state of in-between. On the one hand, the Trumpian kleptocracy is accelerating. With Kelly and Mattis leaving in December, the “adult day care center” has closed, leaving only a pre-school version of Lord of the Flies in the White House. And yet, the end seems to draw near for this vexed time. Voters gave a resounding rebuke to Republicans in Congress, one that may ultimately be generational in nature and that gives Democrats subpoena power. Expect action soon. What’s in those tax returns? How much crony capital have Jared and Donald received over the years? By this time next year, we should know. Moreover, the Mueller investigation is accelerating, drawing closer and closer to the great kleptocrat’s inner circles even as we are left guessing at what sort of revelations we will learn in the months to come.

But that said, massive global instability is the price we pay for Trump. Authoritarian forces are on the rise throughout the world. It would be easy enough to say that these forces have been there all along, but it's more accurate to say that the actions of individual players still matter. Trump was a colossal misfire, an eruption of senile admirers of fascism who think that a country of coal miners, machine guns in every classroom, and Christian sharia law will bring Jesus back, no doubt riding on a dinosaur. But with the markets on a roller coaster ride that ultimately ended down in almost all sectors worldwide, we have to wonder how long business will find the radical Right palatable. Constant turmoil and increased tariffs are making CEOs wonder how useful Trump really is. It's time to take gramps out of the White House and put him in a nursing home.

Beyond the rise of authoritarian power, 2018 was the year in which the rapid pace of climate change became obvious to anyone with a pulse. I am not a big fan of Alexandria Ocasio-Cortez (democratic socialism is a ticket to another right-wing victory), but her Green New Deal just makes sense. The US has spent trillions upon trillions subsidizing oil in various ways (from outright subsidies to the construction of roads which are, of course, paved in oil) and fighting wars in the Middle East to safeguard fossil matter, so why shouldn't we treat energy independence as a matter of national security? There are 50,000 coal miners in the United States, fewer than the 89,000 employees of Sears who will lose their jobs this year and far fewer than the 1.6 million university faculty in the US. If the Democrats want to win in 2020, running on a platform of stopping the rise in temperatures worldwide and the ballooning national debt while restoring basic rights and freedoms taken away during the Trumpian regime would be a good place to start (given that the GOP has forgotten about the deficit now).

As for architecture, what is there left to say about it anymore? Starchitecture has faded; nobody gets excited about cool forms anymore. How can we be surprised? For one thing, no starchitect is making interesting buildings; in fact, the whole movement has been something of a bust. For another, architecture is no longer the profession that shapes space; digital technology is. Failing to recognize this dooms the profession to irrelevance, like heraldry in the days of mustard gas.

But architecture isn't the only institution without purpose. Silicon Valley, it seems, has finally met a time in which nobody cares about what it makes or promises. People are not only tired of big tech, they are tired of startups that promise the world when their only business plan is to be acquired as soon as possible. In fact, for all its promises, startup culture was a bust and it is far smaller than it was two decades ago. Apple made its best products ever (I am typing this on one of the amazing third-generation iPad Pros that I bought), and was punished for it by a massive drop in its stock price.

If any tech became widely accepted by the mainstream in 2018, it was the Internet of Things and the Smart Home. Amazon’s Alexa, Nest and Ring’s video doorbell, and Lutron’s Caseta system were among the winners in this transformation of our interior lives. There is nothing terribly radical about the smart home and, frankly, a lot of the panic about surveillance with the hardware is silly (as if smart phones don’t already do this). But embedded technology is everywhere now.

Still, it's odd how art (and architecture) misses this change. For want of anything else, we are still in the era of post-Internet art, an idea which, unfortunately, I am somewhat to blame for. If there was some merit to thinking about how network culture permeated art in 2011, talking about "post-Internet art" now is about as useful as talking about Abstract Expressionism as "post-automobile" art. Art, like architecture, has lost any purpose or drive forward. Technology and art have drifted apart again and only a few of us hack away at the intersection of the two. Still, art and architecture are always falling into ruin and being reborn. Perhaps this time will be no different and the work we are doing will lead to a rebirth?

The academy is sick as well. Years of poor management practices and bloated administrations have gutted the arts and humanities as faculty were forced to take on heavy teaching loads and real research has been eliminated (in case you wondered, I left Columbia when the new Dean did away with the entire research arm of the school to appease the finance office). Two decades ago, I decried “staff-ism” in schools, but now that is all that’s left.

I left teaching completely this year, resigning from my position at the University of Limerick in Ireland after thirteen years and bringing nearly thirty years of teaching to an end. In large part, it was the basic inability of universities to function that drove me away. What good is it for me to waste my time trying to jump through hoops to get paid when there are people in finance offices whose job literally is to ensure that faculty don't get paid (I've been told this point blank)? And teaching itself isn't much fun anymore. Students, for their part, are more interested in looking at their Instagram feeds than in listening to what I have to say. It's the opposite of the 1960s when students proclaimed the irrelevance of their teachers. Now, faculty proclaim the irrelevance of their students. Bah. It's not worth it. It was a mistake to keep going over the last couple of years. I may come back to education one day—I have many great memories that come from my students and many of them remain my friends to this day—but now is a time when the university is very much irrelevant. Independence is what we need, not sick institutions.

Speaking of sick institutions, there is welcome news this year regarding Facebook: we saw the first signs of that hated enterprise starting to implode. Zuckerberg's pathetic attempt to get a date by building a Web site has wound up doing tremendous damage to the Internet with its reduction of all content to a general level of idiocy. Older forms of Internet communication such as blogs, email mailing lists, and Internet forums are dying and, since nobody reads books or magazines anymore, we communicate less than we did thirty years ago. Instead, we don't even get FarmVille; we get social diarrhea. Nobody likes Facebook. Independent voices are needed on the net again. It's not up to someone else to provide them, it's up to us.

I rebuilt my Web site last week in hopes of returning to being an independent voice in the field. I finished the last year in review with a similar resolution; maybe this year I'm getting cranky enough that it'll actually happen.