The 2010s in Review (The Decade of Shit)

So this post is late, what do you expect from the 2010s? Ten years ago Time Magazine dubbed the 2000s the “worst decade ever,” but in retrospect that was such a carefree time, wasn’t it? Even the end-of-the-decade collapse seemed full of possibility, promising a truly cataclysmic civilizational implosion if nothing else.

In contrast, the 2010s weren’t just another failed decade, they were the decade of shit. All the hype and excitement led us to a universal dissatisfaction. Left, Right, and Center, we’re all pissed off about where we are and not enthusiastic at all about where we’re going. Even the proponents of doom are disappointed. The whole ZeroHedge crowd has been left trying to cover their short positions as the economy lurches onward and the doomers are facing expiration deadlines on their MREs as they wait for TEOTWAWKI. Now sure, this year we’ve already had firestorms the size of Austria ravaging Australia, a rain of rockets in Baghdad, Ukrainian jetliners getting shot out of the sky, a deadly pandemic in China caused by people eating creatures that they really shouldn’t, and the failure of the Senate to uphold the rule of law, but the banality of it all is crushing. While the Dark Mountain set drinks wine around a campfire, gets henna tattoos, and sings along to songs about the end of nature, for the rest of us, it’s just an exhausting, daily slog through the unrelentingly alarming headlines.

We finish the decade with network culture in its last days. Back in 2010, when I was working in earnest on a book on network culture,* I made the following prediction:

“Toward the end of the decade, there will be signs of the end of network culture. It’ll have had a good run of 30 years: the length of one generation. It’s at that stage that everything solid will melt into air again, but just how, I have no idea.”

Well now we know how. All the giddy delirium about the network and globalization is gone. We’ve got our always-on links to the net in our hands all the time, we’ve got our digital economy, and with it all we’ve entered a period of stark cultural decline. It’s an empty time, devoid of cultural monuments. Name one consequential building of this past decade, one!

Juoda Paduška (Black Pillow), Valdas Ozarinskas

Well, there is this great project by architect Valdas Ozarinskas (who didn’t make it through the decade), a massive dark summation of our failures. But culture’s not doing well. The easy appeal to dopamine receptors provided by Facebook, Twitter, Instagram, Netflix, and YouTube has undone our ability to focus. This is the golden age of cat videos, nothing more. Any sustained thought is gone and bottom-up efforts have dissipated.

A decade ago, my blogging comrades and I were plotting how to take over architectural discourse. But this amounted to nothing. Those blogs, and many others, have been silenced, absorbed into garbage sites like Medium, Facebook, Forbes, and BusinessInsider or just left in whatever state they were in, circa 2014, as their creators went on to Twitter, Facebook, Instagram, or just premature (or belated?) fin-de-siècle ennui. Podcasts are the one flourishing outpost of real DIY content on the Internet, perhaps because they distract from the world around us, but in all other respects the DIY ethic is on the wane. The most interesting publication of the 2000s, Make Magazine, went under in 2019, at least temporarily. Kickstarter is the metaphor for the decade: a lot of promises, a lot of crap that we’ve thrown away, a lot of outright lies, and a feeling of dread that somehow we’ll be sucked into its maw and participate again.

But it’s not just banality. As Bruce Sterling points out in his State of the World 2020 at the Well, we were always too optimistic about what bottom-up efforts could do. Network culture gave birth to the vilest of viral propaganda, some of it from state actors, some of it genuinely home grown. In Bruce’s words, “Our efforts had evolved an ecosystem for distribution of weaponized memes.”

Network culture didn’t usher in a new world of Free! Open Access! Networked Culture!, rather it ushered in the first phase of the Jackpot. That’s a phrase I use often these days, lifted from William Gibson’s 2014 novel The Peripheral, one of a handful of genuinely insightful cultural artifacts from the last decade (note that William Gibson’s Twitter name is @GreatDismal). The Jackpot refers—not so cheerily—to the end for some 80% of the world’s population, rich and poor, developed and undeveloped (largely the former in each pair).

Time for a lengthy quote from Gibson:

No comets crashing, nothing you could really call a nuclear war. Just everything else, tangled in the changing climate: droughts, water shortages, crop failures, honeybees gone like they almost were now, collapse of other keystone species, every last alpha predator gone, antibiotics doing even less than they already did, diseases that were never quite the one big pandemic but big enough to be historic events in themselves. And all of it around people: how people were, how many of them there were, how they’d changed things just by being there. …

But science … had been the wild card, the twist. With everything stumbling deeper into a ditch of shit, history itself become a slaughterhouse, science had started popping. Not all at once, no one big heroic thing, but there were cleaner, cheaper energy sources, more effective ways to get carbon out of the air, new drugs that did what antibiotics had done before…. Ways to print food that required much less in the way of actual food to begin with. So everything, however deeply fucked in general, was lit increasingly by the new, by things that made people blink and sit up, but then the rest of it would just go on, deeper into the ditch. A progress accompanied by constant violence, he said, by sufferings unimaginable. …

None of that … had necessarily been as bad for very rich people. The richest had gotten richer, there being fewer to own whatever there was. Constant crisis had provided constant opportunity. … At the deepest point of everything going to shit, population radically reduced, the survivors saw less carbon being dumped into the system, with what was still being produced being eaten by those towers they’d built… And seeing that, for them, the survivors, was like seeing the bullet dodged.

Now amidst modernization, two World Wars, massive pollution, and unprecedented environmental cataclysms such as Minamata Bay, Chernobyl, or Bhopal, the twentieth century was hardly a cakewalk, and when it comes down to it, the salinization of the Fertile Crescent is likely our real original sin (won’t someone start a green Church around this as the fall from grace?). But the Jackpot officially began on 9 November 2016, a day that reminded many of us of the outrage followed by the blank numbness that we experienced on 9/11. A new emergency was declared by never-Trump neoliberals, anti-Trump leftists, and, outraged by all the outrage, the pro-Trump neoreactionaries and neo-Nazis who had stocked up on guns in case HRC was elected and then had nothing to do. There would be no turning back now.

The horrifying truth underlying the Jackpot is that it isn’t just an accident. We used to think that global inequality was based on structural poverty, but now it’s becoming clear that automation will ensure that vast numbers of people are no longer needed. Couple that with climate change and you have the Jackpot. Entire swathes of the world—Afghanistan, Iraq, Syria, Libya, South Sudan, and Yemen—have already been rendered nearly uninhabitable by drought and continual war waged by the United States, Iran, Russia, and their proxies for strategic purposes. Trump’s America is full of individuals who have no future whatsoever and, with self-driving vehicles on the way, millions of truck and taxi drivers are about to find themselves as in demand as coal miners in West Virginia. Moreover, even as we continue to belch out carbon at an unprecedented rate, it’s only a matter of time before jobs in the fossil fuel and traditional automotive industries disappear as well. New jobs will appear, of course, but there will be far fewer of those and the idea of teaching coding to coal miners proved not to be that sound.

The newly disenfranchised have little to lose: the cracks are showing. It’s not that the containment can only last so long, it’s already breaking everywhere. We can see the collapse of the Paris Accords not just as a deliberate step further into dark acceleration but as a lizard-like reaction to the Jackpot, part of a new strategy: national government as survivalist retreat.

Sure, the Trump administration is largely composed of knuckle-dragging defectives, but we can discern a strategy if we look carefully enough: rather than making sacrifices to survive the Jackpot, they will do what they can to get the biggest piece of the pie for themselves and their cronies in the colossal redistribution of wealth it represents. And don’t get your hopes up about a socialist revolution in the next round of elections. My academic leftie friends impute way too much to old Frankfurt School notions of ideology: the reason Trump and his cohort of populists have been elected isn’t because the poor have been duped and only need to see the way, it’s because they know there’s no hope for them. There’s no standing in solidarity behind a neo-socialist boomer; they all know that’s not going to work, and if they’re going out, they’re going to drag down the elites with them. Americans elected this grinning, Adderall-abusing droog and will most likely do so again, just like the rest of the populists rising to power worldwide. In Joseph de Maistre’s (and Julie Mason’s) words, “every nation gets the government it deserves.” Who better than the “King of Debt” to show us that the Jackpot is here?

If anything, not only has the Left failed to come up with a convincing counter-argument, it’s gone down the entirely wrong route with identity politics, which has all but taken over not just Left politics but also the academy and museums. Identity politics isn’t anti-Trump, it’s high Trump, embracing the idea that the Jackpot has started and the next step is to redivide the pot in favor of your tribe. In the art world and the academy, it teams up with an exhausted neoliberalism looking for an alibi while it also helps sell culture to a new generation of oligarchs even as it further exacerbates the rampant tribalism in our society. Steve Bannon understood it well: if the Left fights on the basis of identity politics, his Right-wing identity politics wins every time. But for many on the woke Left, this is hardly a problem: a Trumpian government gives their screaming more legitimacy and feeds their fevered dreams of revolution.

Against the rise of identity politics on the Left and Right, the Center is left floundering. Neoliberalism is exhausted, its most appropriate cultural manifestation being overtourism. As the decade started, I repeatedly tried to launch a major research project on the phenomenon in the academy, but the project fell on deaf ears. I wasn’t surprised. Universities are just like travel: there is an appearance of diversity and difference, but it’s a generalized sameness, a gray nothing in which you won’t ever encounter anything new, just another Starbucks serving poor-quality beans, over-roasted so that you can’t tell what they are, and some screaming about how special the place is.

My sense is that if there’s to be any kind of hope in the next decade to get out of what Bruce Sterling appropriately calls “the New Dark,” it’s going to be to achieve the impossible: throw out identity politics (left and right) and turn back toward a grand project—what academics used to criticize as a “metanarrative”—that most of us can get behind.

As this decade showed, there’s little question that this is climate change and toxins in the environment. Here’s where the Jackpot has its upside: the deaths of billions of humans pale in comparison to the species-cide we are undertaking left and right. Have you listened for the dawn chorus of birds lately? A hundred years from now the biggest news of this decade may be that this is when Rachel Carson’s Silent Spring became real as birds and pollinators died off in massive numbers. Here on the Northeast seaboard, we’ve bid goodbye to the ash tree and are watching for beech leaf disease, white pine needle disease, sudden oak death, and hoping the spotted lanternfly doesn’t cross the Delaware. It seems like the only place nature really thrives anymore isn’t in national parks, it’s in radiation exclusion zones.

But dead birds and trees don’t matter much to the average person who doesn’t have anywhere to shop besides the Dollar General Store and hasn’t seen fresh vegetables in years. The key is probably going to be luck, bad luck (and what is the Jackpot about if it’s not luck?). Maybe, just maybe, if a series of truly awful major environmental cataclysms hit the key countries involved in carbon production—the US, China, India, Canada—they might be alarmed enough to do something about it. We aren’t talking about a category 5 hurricane hitting New York City. Nobody will care about that; we are talking about flattening a good portion of Florida and the Southeast, plus a good bit of Texas, maybe a good Dust Bowl 2.0 coupled with massive flooding in the Midwest, then doing that five or six times over worldwide in the space of a couple of years. That’s a horrible, terrible thing, but if luck isn’t with us and it doesn’t happen, what are our chances of avoiding much worse conditions? And even if it does, will we be too late to turn back the clock?

*Never finished because a publisher botched the project to the point I quit working on it in disgust. But you can read an early essay (circa 2006) on network culture here.

My Dear Berlin Wall

This weekend marks the thirtieth anniversary of the fall of the Berlin Wall and, to commemorate, I thought it would be appropriate to post this chapter from Blue Monday, which Robert Sumrell and I published a decade ago as AUDC.

My Dear Berlin Wall

On June 17, 1979, Eija-Riitta Eklöf, a Swedish woman, married the Berlin Wall at Groß-Ziethener Straße, taking Wall Winther Berliner-Mauer as her name. The final piece of heroic modernist architecture, the Berlin Wall was constructed just as the movement’s Utopian political ambitions had begun to wane. By then, with the eastward spread of modernism during the Khrushchev years, the ideological distinctiveness of modernity had come to an end, making East–West, modern–antimodern harder to distinguish. Still, the architects of the Berlin Wall hoped it would change society.

For a while, it did just that. After the close of World War II, Berlin was a microcosm of Germany. Both city and country were cut into four occupation zones, each overseen by a commander-in-chief from one of the four Allied powers, the United States, the United Kingdom, France, and the Union of Soviet Socialist Republics. As the geopolitical tide continued to shift in the years after the war, tensions escalated between the Soviets and the Western allies. By the time of the Berlin Blockade of 1948-1949, it became clear that the three allied segments of Berlin would become “West Berlin,” an enclave of the Federal Republic of Germany (or “West Germany”) while the Soviet-controlled sector would become “East Berlin,” associated with the German Democratic Republic (or “East Germany”). Responding to the blockade crisis, Belgium, Canada, Denmark, France, Iceland, Italy, Luxembourg, Netherlands, Norway, Portugal, United Kingdom, and the United States promised mutual military defense through the North Atlantic Treaty Organization (NATO) in 1949. In 1955, West Germany joined NATO and, in response, the Soviet Union formed the Warsaw Pact with Albania, Bulgaria, Czechoslovakia, Hungary, Poland, and Romania. East Germany joined one year later as two events cemented the division of the world. The first was the Suez Crisis in which the United States, fearing thermonuclear war with the Soviet Union, forced France and Britain to withdraw from the Suez Canal, a critical piece of infrastructure they had occupied earlier that year to prevent Egypt from nationalizing it. The second was the Hungarian Revolution, crushed by Soviet tanks as the West watched. With the refusal of the Soviet Union and the United States to enter into direct confrontation, it became clear that the world, Europe, Germany, and Berlin had been divided into spheres of influence controlled by the superpowers.

The superpowers fought the Cold War through their display of achievements in science, propaganda, and accumulation. Berlin became the prime place for the two competing rivals to showcase their material culture. In the East, the Stalinallee, designed by architects Hartmann, Henselmann, Hopp, Leucht, Paulick and Souradny, was a nearly 2km long, 89m wide boulevard, lined with eight-story Socialist Classicist buildings. Inside these vast structures, workers enjoyed luxurious apartments, shops, and restaurants. In response, West Berlin held the Interbau exhibit in 1957, assembling the masters of modern architecture including Alvar Aalto, Walter Gropius, Le Corbusier, Oscar Niemeyer, Johannes van den Broek and Jaap Bakema, Egon Eiermann, and Pierre Vago to build a model modernist community of housing blocks in a park in the Hansaviertel quarter.

Nor was the battle of lifestyle between East and West limited to architecture. Soviet communism followed capitalism in focusing on industrial productivity and expansion within a newly global market. While the United States exported its products—and eventually outsourced its production—throughout the world, the Soviets used COMECON, the economic equivalent of the Warsaw Pact, to eliminate trade barriers among communist countries. Each country produced objects to represent its superior quality of life, validating each system not only to its own citizens, but also to each other and to the rest of the world. The importance of material culture in the Cold War is underscored by the 1959 Nixon-Khrushchev “Kitchen Debate,” played out between the two powers in a demonstration kitchen at a model house during the American National Exhibit in Moscow. At the impromptu debate, Soviet Premier Nikita Khrushchev expressed his disgust at the heavily automated kitchen and asked if there was a machine that “puts food into the mouth and pushes it down.” U.S. Vice President Richard Nixon responded, “Would it not be better to compete in the relative merits of washing machines than in the strength of rockets?” A Cold War battle was fought through housewares. (1)

The climax of the battle came in Berlin. In the divided city, half a million people would go back and forth each day between East and West. Westerners would shop in East Berlin where products, subsidized by the East Bloc, were cheap. Easterners would shop in the Western sector where fetish items such as seamless nylons and tropical fruits could be found. Overall, however, the flow was westward. The rate of defection was unstoppable: between 1949 and 1961 some 2.5 million East Germans left for the West, primarily through Berlin. But just as destructive was the mass flight of subsidized objects westward, a migration that pushed the inefficient communist system to collapse. Khrushchev had won the debate: cheap goods from the East were more desirable, but the cost to the system was unacceptable. By this time, production and accumulation in the East Bloc had become goals in and of themselves, devoid of logic and use, no longer tied to the basic needs of the economy. In 1961, Khrushchev launched the Third Economic Program to increase the Soviet Union’s production of industrial goods at all costs. During a meeting that year, COMECON decided that the flow of people and products had to be stopped. On August 13, 1961 the border was sealed with a barrier constructed by East German troops.

The result was a city divided in two without regard for prior form or use. Over time, the Berlin Wall evolved from barbed wire fences (1961-1965) to a concrete wall (1965-1975), until it reached its full maturity in 1975. The final form would be not one but two Walls, each constructed from 45,000 sections of reinforced concrete, each section 3.6 meters high by 1.5 meters wide and weighing 2.75 tons, separated by a no-man’s-land as wide as 91 meters. The Wall was capped by a smooth pipe, making it difficult to scale, and was accompanied by fences, trenches, and barbed wire as well as over 300 watchtowers and thirty bunkers. (2)

With its Utopian aspiration to change society, the Wall was the last product of heroic modernism. It succeeded in changing society, but as with most modernist products, not in the way its builders intended. East Berlin, open to the larger body of the East Bloc, withered, becoming little more than a vacation wonderland for the Politburo élite of the Soviet Union and Warsaw Pact and a playground for spies of both camps. Cut off, West Berlin thrived. By providing a datum line for Berlin, the Wall gave meaning to the lives of its inhabitants. In recognition of this, Joseph Beuys proposed that the Wall should be made taller by 5 cm for “aesthetic purposes.” (3)

As the Wall was being constructed in Berlin, Situationists in Paris and elsewhere were advocating for radical changes in cities as a means of preserving urban life. For them, the aesthetics of modernism and the forces of modernity were destroying urbanity itself. During this period working-class Paris was being emptied out, its inhabitants sent by the government to an artificial modernist city in the suburbs while an equally artificial cultural capital for tourists and industry was created in the cleaned-up center. Led by Guy Debord, the Situationists hoped to recapture the city by creating varied ambiances and environments, strategies providing opportunities for stimulation and chance drift. When Situationist architect Constant Nieuwenhuys deployed the floating transparent layers of his New Babylon to augment the existing city with unique, flexible, and transient spaces, Debord condemned the project, arguing that the existing city was already almost perfect. Only minor modifications were necessary, such as adding light switches to street lights so that they could be turned on and off at will and allowing people to wander in subways after they were shut off at night.

In his 1972 thesis at the Architectural Association, entitled “Exodus, or the Voluntary Prisoners of Architecture,” Rem Koolhaas found a way of reconciling modernism with Situationism through the figure of the Wall. Suggesting that the Wall might be exported to London and made to encircle it, Koolhaas writes, “The inhabitants of this architecture, those strong enough to love it, would become its Voluntary Prisoners, ecstatic in the freedom of their architectural confines.” Inside, life would be “a continuous state of ornamental frenzy and decorative delirium, an overdose of symbols.” Although it officially proposes a way of making London more interesting, Koolhaas’s thesis is really a set of observations about the already existing condition of the real Wall. (4)

In choosing to encircle London with the Wall, Koolhaas recognized that it was not only the last great product of modernism, it was the last work of heavy architecture. Already in 1966, in his introduction to 40 Under 40, Robert Stern observed that an increasingly dematerialized “cardboard architecture” was “the order of the day” in the United States while in England, architects such as Archigram were proposing barrier-less technological utopias.(5)

Built of concrete, the Wall was solid, weighty. It hearkened back to the days of the medieval city walls, which were not only defensive but attempted to organize and contain a world progressively more interconnected through communications and trade. Walls acted as concentrators, defining places in which early capitalism and urbanity could be found and intensifying both. So long as the modes of communication remained physical and the methods of making and trading goods were slow, nations retained their authority and autonomy through architectural solidity.

The destruction of the Berlin Wall in 1989 is concurrent with the pervasive and irreversible spread of Empire and the end of heavy architecture. Effectively, the Wall fell by accident. During 1989, mass demonstrations in East Germany led to the resignation of East German leader Erich Honecker. Soon, border restrictions between neighboring nations were lifted and the new government decided to allow East Berliners to apply for visas to visit West Germany. On November 9, 1989, East German Minister of Propaganda Günter Schabowski accidentally unleashed the Wall’s destruction. Shortly before a televised press conference, the Minister was handed a note outlining new travel regulations between East and West Berlin. Having recently returned from vacation, Schabowski did not have a good grasp on the enormity of the demonstrations in East Berlin or on the policy being outlined in the note. He decided to read the note aloud at the end of his speech, including a section stating that free travel would be allowed across the border. Not knowing how to properly answer questions as to when these new regulations would come into effect, he simply responded, “As far as I know effective immediately, right now.” Tens of thousands of people crowded the checkpoints at the Wall and demanded entry, overwhelming border guards. Unwilling to massacre the crowds, the guards yielded, effectively ending the Wall’s power. (6)

The Schabowski misspeak points to the underlying reason for the Wall’s collapse: the lack of information flow in the East Bloc. The goals of Khrushchev’s Third Economic Program were finally met by the early 1980s, but by that point the United States was no longer interested in production. For America, the manufacturing of objects proved to be more lucrative when outsourced to the developing world. By sending its production overseas, America assured the success of its ideology in the global sphere while concentrating on the production of the virtual. Having spent itself on production of objects, as well as on more bluntly applied foreign aid, the Soviet Union collapsed. Manuel Castells observes that, “in the 1980s the Soviet Union produced substantially more than the US in a number of heavy industrial sectors: it produced 80 percent more steel, 78 percent more cement, 42 percent more oil, 55 percent more fertilizer, twice as much pig iron, and five times as many tractors.” The problem, he concludes, was that material culture no longer mattered. The PC revolution was completely counter to the Soviet Union’s centralization while photocopy machines were in the hands of the KGB. (7)

With the Wall gone, and with it the division between East and West undone, the world is permeated by flows of information. Physical borders no longer hold back the flow of an immaterial capital. Indeed, soon after the Wall fell, the European Union set about in earnest to do away with borders between individual nation states.

But through it all, the Wall has had no greater fan than Wall Winther Berliner-Mauer. In her view, the Wall allowed peace to be maintained between East and West. As soldiers looked over the Wall to the other side, they saw men just like themselves, with families they loved and wanted to protect. In this way, the Wall created a bond between men who would otherwise be faceless enemies. But Berliner-Mauer’s love for the Wall is far from abstract. For her, objects aren’t inert but rather can possess souls and become individuals to fall in love with. Berliner-Mauer sees the Wall as a noble being and says she is erotically attracted to his horizontal lines and sheer presence. While Berliner-Mauer is in a seemingly extreme, sexual relationship with the object of her desire, our animistic belief in objects is widespread across society. Adults and children alike confide in stuffed animals, yell at traffic signs, and stroke their sports cars all the time. Many people choose to love objects over their friends, their spouses, or themselves. Berliner-Mauer understands that the role of objects in our lives has changed. Objects are no longer tools that stand in to do work for us, or surrogates for normal human activities. Objects have come into their own, and as such we have formed new kinds of relationships with them that do not have a precedent in human interaction. The Wall is a loving presence in her life, and she loves it in return.

Faced with the tragedy of the Wall’s destruction, Berliner-Mauer created a technique she calls “Temporal Displacement” to fix her mind permanently in the period during which the Wall existed, even at the cost of creating new memories. (8) For Berliner-Mauer, marrying the Wall is a last means to preserve the clarity of modernism and the effectiveness of architecture as a solid and heavy means of organization. To love the Berlin Wall is to dream of holding onto the simple, clear disciplinary regime of order and punishment that came before we were all complicit within the system. In today’s society of control, as Gilles Deleuze writes, we don’t need enclosures any more. Instead, we live in a world of endless modulations, internalizing control in order to direct ourselves instead of being directed.(9) After the fall of the Wall and the end of enclosures, even evil itself, Jean Baudrillard observes, loses its identity and spreads evenly through culture. An Other space is gone from the world. North Korea and Cuba are mere leftovers, relegated to tourism. There is nowhere to defect to: without the Wall, we must all face our own complicity in the system.(10)

(1) Elaine Tyler May, Homeward Bound: American Families in the Cold War Era (New York: Basic Books, 1999), 10-12.

(2) Thomas Flemming, The Berlin Wall: Division of a City (Berlin: be.bra Verlag, 2000), 75.

(3) Irving Sandler, Art of the Postmodern Era (Boulder, Colorado: Westview Press, 1998), 110.

(4) Rem Koolhaas, S, M, L, XL (New York: Monacelli, 1995), 2-21.

(5) Robert A.M. Stern, ed., 40 Under 40: An Exhibition of Young Talent in Architecture (New York: The Architectural League of New York, 1966), vii.

(6) Angela Stent, Russia and Germany Reborn: Unification, the Soviet Collapse, and the New Europe (Princeton: Princeton University Press, 1999), 94-96.

(7) Manuel Castells, End of Millennium, Second Edition (Oxford, UK: Blackwell, 2000), 26-27. See also the entire chapter on “The Crisis of Industrial Statism and the Collapse of the Soviet Union,” 5-67.

(8) http://www.berlinermauer.se (note from 2019: Eija-Riitta Eklöf-Berliner-Mauer died in 2015 and this site is now some kind of spam site)

(9) Gilles Deleuze, “Postscript on the Societies of Control,” October, Vol. 59 (Winter 1992), 3-7.

(10) Jean Baudrillard, “The Thawing of the East,” The Illusion of the End (Stanford: Stanford University Press, 1994), 28-33.

On Gardening

“If it is true that the next renaissance of human culture will be the reconstruction of the natural world in our cities and suburbs, then it will be the designers, not the politicians, who will lead this revolution. And plants will be at the center of it all.”
—Thomas Rainer, Planting in a Post-Wild World.


It’s true, I’ve disappeared. It’s time to admit it. Architecture and academics have become boring. There are no new buildings of consequence and academics are plumbing the depths of irrelevance. Star worship killed both.

Few people who knew me in academics knew about my secret life, investing and managing real estate (anyone who ever took my Network City course should have received some good lessons on that), and that panned out well enough that I don’t have to prostitute myself for low-paying academic jobs anymore for the “experience” and “exposure.” There are many things I should do with this freedom: chiefly, proposals for art and museum installations, writing, and my book on the first decade of the Netlab. I am slowly working on all of these, but I found a new secret life, something else that has absorbed me thoroughly, something that I want to do: gardening.

Forget the term “landscape,” a holdover from the days when men wore cargo shorts and baggy T-shirts. Landscape implies finding an earth mover and puttering about with it, treating a plot of land as if it were a building, terracing it without regard to whatever might be living there. Periodically, claims will be made about clever schemes for land reclamation or water filtration but inevitably these will go awry since the designer will have given little thought to the plants and even less to the structure of the soil. Landscape needs a broad area to take into view, which necessitates clear cutting and large expanses, not the minute detail of gardening. Landscape suggests something for a public to view, not something productive for a household. Worst of all, around here, “landscapers” are illiterate madmen with gas-powered leaf blowers, whose sole job is to redistribute wealth into their pockets while driving me out of my goddamned skull. I saw one the other day, “Stone Age Landscapers.” Like something out of a Robert Smithson narrative, the name says it all.


Early April, Toadshade Trillium (Trillium sessile) is surrounded by fiddlehead ferns and debris. 

Gardening is something altogether different. I embrace the amateur nature of the term, its lack of regard for academics and high art. These things don’t matter anymore. We all know this. But sticking your hands into the dirt every day does. Growing things for a particular area that you know well does. It’s not something you hire someone else to do, it’s something you do yourself.

Gardening is a massive investment of time. Learning plants takes time. When I started I didn’t know Virginia Creeper from poison ivy and took the advice of a friend who suggested we pull it up too. What a mistake, only now is it coming back to cover bare spots. Plants take time, a lot of time. You don’t just put them in the ground and walk away. By their nature, they grow and they grow slowly. One reason I plunged headlong into gardening is that I realized that the temporal nature of the project meant anything I did now would only start paying off years later. There are things I am doing now that won’t have real impact for a decade or two.

After three years of intense gardening, I am only beginning to understand how a half acre garden on the periphery of New York City can be radical in its own way. This hit home to me last night. Notwithstanding their previously dwindling populations, the fireflies came back in force this year. My sixteen-year-old daughter and I sat on the chairs in our back yard, amidst all the things that I had planted, and delighted in them. Planting pine trees, letting debris in the woody area of the property rot and offering unmown micro-meadows seems to have done the trick.


A native Mountain Laurel (Kalmia latifolia) bloomed in early May this year.

My goal is to build a contemporary suburban pleasure garden, a place to restore some measure of peace to my family, friends, and myself even in a world gone mad. Logic is gone, we held its funeral years ago. In its stead is the raw anger of Brett Kavanaugh on one side and “woke” folk on the other racing each other in a doomsday death spiral. I have little time for that here, where I have gone, just as Diocletian turned to tending his cabbages after retiring as Emperor. My garden is a spiritual place, far from any church and its tired old songs. It stems from an earth-centric spirituality, a celebration of the cycle of life in its raucous abundance, drawing from warring pagan forces of soil, sun, and rain, not of some angry, white-bearded man-god better suited to faraway desert lands barren of living things. I have learned a lot about botany and horticulture, but I have also learned that these things have limits. I am after the deep connection with the living that science too often denies.

I have embraced the native plants that belong to this place, this town of Montclair, New Jersey, but I haven’t embraced the native plant movement and its assumption of original sin, guilt, and the rejection of pleasure. It’s merely more Protestantism in disguise and I read too much Nietzsche in college to revel in guilt. I don’t want to fill my yard with a patch of weedy-looking things. I am not pretending that I will restore a long-vanished landscape, nor do I presume to be primarily working for the pollinators or wildlife (although I did think about the fireflies a bit). Rather, I see such benefits as byproducts, just as habitat destruction and the spread of invasive species have been byproducts of recent landscape design. We can design byproducts into our plans, think about the side effects we want even as we create something for ourselves.

With this in mind, my plants are grouped in terms of the native plant communities indigenous to this place—meadows, woods, moist, dry, sun, and shade—along with a small grouping of native and non-native plants that traditionally have been identified with restorative properties (a “physick garden”) and a few food-bearing plants (sadly, the lack of controls on deer makes my dream of a self-sustaining lifestyle difficult here).


A non-native in my “physick garden,” Borage (Borago officinalis), which makes for an excellent addition to cocktails and is possibly the failure that I lament the most this season. The four-foot-tall plant fell over and died this week, its lower stem rotten, the victim of too much water early in the season.

I set out to make this garden as a personal project, but I’ve begun to think that it may be more than that, that this project could eventually serve as a model for a new suburban landscape. There is so much land here we could turn to our advantage and yet most of it is a green desert, bereft of both biological diversity and visual appeal. A painful aspect of learning what can be achieved with gardening is that it opens your eyes to the ugliness of most properties. Sad houses are bracketed by sadder yards, at best a forlorn tree or two sits in an unused front lawn, some tired mop-headed hydrangea compete with invasive barberry and euonymous amidst a sea of black-painted “mulch,” all of which telegraphs nothing more than the property owner’s laziness. The lack of intentionality boggles the mind, as does the poverty of it all.


Early May and our Eastern Redbud (Cercis canadensis) has lost most of its flowers, carpeting the ground around it up by Highland Avenue. It is July now and the ferns and grasses near it have come in thick.   

In the last three years, I have planted some thirty-eight trees on this half acre, all but three of which have been native (two are apple trees and one is a Norway spruce that my daughter received on Arbor Day at her school). These species—Carolina Silverbells, White Pines, Magnolias, Paw-Paws, Eastern Cypresses, American Hazelnuts, Redbuds, American Dogwoods, and so on—are a mix of understory and canopy trees, nestled in amidst the tulip-beech-oak canopy that defines the edges of this property. I spray religiously with Deer Out, a non-toxic mix of cayenne peppers, peppermint, and eggs, and that seems to have largely discouraged the ravenous deer in this suburb. I have let many seedlings thrive where they fall, helping in reforestation. I have planted dozens upon dozens of bushes, at least fifteen native rhododendrons alone. Native perennials now dot my yard, with woodland ephemerals tucked wherever I can find a place for them and ferns—relics from the Carboniferous period when giant dragonflies the size of hawks flew about—by the hundreds. Even a few lycopods, perhaps the most alien and oldest form of land plants, have managed to find homes here. Where before there had been mulch or bare ground, now something grows. Still, the property eagerly absorbs as many plantings as I can throw at it.


A recent haul. Two carts of native plants purchased on June 25th from the Bowman’s Hill Wildflower Preserve in New Hope, PA. It would be lovely to find natives from sources closer to home, but there are no nurseries that grow from local stock any closer and the price is right.

But my race against time this summer has run out. It’s too hot and dry to plant more for now. I’ll have to wait until fall, weed out the invasive plants that infest my land, products of criminal nurserymen and heedless neighbors, and work on my other projects. Perhaps I’ll even blog from time to time, or work on my book and the other things. Soon enough, fall will come, and with her cooler nights and, I hope, quenching rains, another planting season will begin.

On Networked Publics and the Facebook

At a recent party, an acquaintance asked where I’d disappeared to. Nowhere, I said, and when I inquired into what prompted their question, they responded by saying that they couldn’t find me on the Facebook. I was a little surprised about this since I am hardly the first person to deactivate or delete an account; in the United States and the European Union, the Facebook’s membership has peaked and has been shrinking for a few years while engagement is in steep decline.

Still, the Facebook has effectively replaced most social communication for many people, so I initially had some anxiety about deactivating my account. I thought it would be hard to go without the Facebook since it gathered together people from throughout my life and career and also hosted a few key discussion groups that I enjoyed (mainly pertaining to modular synthesis). Two months later, I don’t feel I am missing out on anything; on the contrary, I am more at ease. Losing the constant noise of the Facebook feed is a joy. If anyone actually wants to get in touch, I am here.

The connections we establish on Facebook are superficial. Birthdays are a case in point: it’s lovely when people remember your birthday, but they don’t really, the Facebook prompts them to. And a couple of happy birthdays don’t excuse the toxic impact the site has had on politics; I find it much more oppressive than Twitter, which is used by a large number of journalists and bloggers. My most active Twitter circle is composed of some of the best writers in architecture and urbanism today as well as a smattering of people involved with digital culture. So, when I mentioned I was quitting Facebook, William Ball sent a link to a scary article by Timothy McLaughlin about how Facebook’s rise in Myanmar fueled genocide while Frank Pasquale lamented that Facebook might wind up like Big Tobacco, accepting the shrinking of its domestic market since it is doing so well overseas.

This gets me thinking. Something has changed in network culture: the 2016 election and the end of the Facebook’s growth in the US and EU are indicators that the heady exuberance about the Net has turned to anxiety and dismay. That the methods of communication used by networked publics are fundamentally flawed in a way that earlier publics were not isn’t just something that scholars talk about anymore, it’s something we all face, daily. Back when we took on the term as an object of study at the Annenberg Center for Communication, we understood “networked publics” to refer to a group of individuals with a particular interest in connecting together digitally. Some researchers—not coincidentally sponsored by technology corporations rather than universities—mistakenly depicted social network sites as networked publics in themselves. This is a category error: with few exceptions (the first incarnation of aSmallWorld, intended as a gathering place for the mega-rich and their hangers-on, comes to mind), social networking sites act not as publics in themselves but rather as hosts or platforms on which, in theory, individuals could participate in a variety of different publics.

But they don’t. Algorithms ensure posts only reach individuals who “like” similar content, inhibiting the discussion and deliberation necessary to create a public, supplanting it with a brown slime of happy (or angry, or sad, depending on the sorts of things one “likes”) posts. Social networking sites profit off of various malignant actors who promote content covertly. Controversy is good, creating more engagement, regardless of the human cost.
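To make the mechanism concrete, here is a minimal sketch in Python of engagement-driven feed ranking. It is a toy model, not Facebook’s actual algorithm: the topic labels, the affinity scoring, and the sample posts are all invented for illustration, but the structural point stands, since whatever you have “liked” before is what you will see more of.

```python
from collections import Counter

def rank_feed(posts, liked_topics, k=10):
    """Rank candidate posts by overlap with topics the user has already
    'liked'. Posts resembling past engagement float to the top; anything
    unfamiliar (deliberation, dissent) sinks out of view."""
    affinity = Counter(liked_topics)  # e.g. {"outrage": 12, "cats": 5}

    def score(post):
        # Sum the user's affinity for each topic the post touches.
        return sum(affinity[t] for t in post["topics"])

    return sorted(posts, key=score, reverse=True)[:k]

# A user with a history of liking outrage (and some cats):
history = ["outrage"] * 12 + ["cats"] * 5
posts = [
    {"id": 1, "topics": ["outrage", "politics"]},
    {"id": 2, "topics": ["cats"]},
    {"id": 3, "topics": ["deliberation", "policy"]},  # never surfaces
]
print(rank_feed(posts, history, k=2))  # posts 1 and 2; post 3 is invisible
```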

If publics can’t form on social networking sites and if, for many people, social networking sites are the primary means of interacting with others, is it possible that publics and public discourse are becoming extinct as a consequence? When I see people saying “it’s time for us to have a conversation” when they really want to shame you into accepting whatever idea they have adopted as their cause that day, I wonder if we have become completely unable to engage in actual public discourse. After all, the modern category of the public is less than four centuries old. Why should we be so bold as to think it will endure?


Drones at the New School

I am delighted to announce that I will be presenting Perkūnas at the New School tomorrow during the Sonic Pharmakon conference put together by Ed Keller for the Center for Transformational Media (I am a fellow with CTM this year). See the schedule here. As a modular synthesist with a sound sensitivity disorder, I am committed to the possibilities of drone and am fascinated to see Sunday’s events (alas, I am on a plane back from Kauai today so will miss the day).

Aleksandra Kasuba, 1926-2019

Artist and designer Aleksandra Kasuba passed away on March 5. Kasuba was a brilliant force and I knew her throughout my life. Among my earliest memories is crawling around in her Lived-In Environment, a radical transformation, through tensile constructions, of the first floor of the Upper West Side brownstone that she and her husband owned. My parents were friends with her and her husband, the sculptor Vytautas Kasuba, and we would see each other periodically.


Although I lost touch with her after I went to graduate school and then moved to Los Angeles, we reconnected on the occasion of my giving a lecture on her work at the National Gallery of Art in Lithuania to accompany the exhibition Spectrum: An Afterthought. During those 25 years she had kept tabs on me and agreed that I could give the lecture—apparently other historians had failed to understand her work properly—and I took copious notes on our discussions. Last week I wrote her obituary for the Architects’ Newspaper. See it here. I am revising my talk for the catalog of the retrospective of her work to take place there in 2020.

Year in Review 2018


I let six years go by without a Year in Review post before restarting the tradition last year. I won’t let it lapse this time, although, with the frenetic pace of news, it seems like we have all aged six years in 2018.

Things are in a profound state of in-between. On the one hand, the Trumpian kleptocracy is accelerating. With Kelly and Mattis leaving in December, the “adult day care center” has closed, leaving only a pre-school version of Lord of the Flies in the White House. And yet, the end seems to draw near for this vexed time. Voters gave a resounding rebuke to Republicans in Congress, one that may ultimately be generational in nature and that gives Democrats subpoena power. Expect action soon. What’s in those tax returns? How much crony capital have Jared and Donald received over the years? By this time next year, we should know. Moreover, the Mueller investigation is accelerating, drawing closer and closer to the great kleptocrat’s inner circles even as we are left guessing at what sort of revelations we will learn in the months to come.

But that said, massive global instability is the price we pay for Trump. Authoritarian forces are on the rise throughout the world. It would be easy enough to say that these forces have been there all along, but it’s more accurate to say that the actions of individual players still matter. Trump was a colossal misfire, an eruption of senile admirers of fascism who think that a country of coal miners, machine guns in every classroom, and Christian sharia law will bring Jesus back, no doubt riding on a dinosaur. But with the markets on a roller coaster ride that ultimately ended down in almost all sectors worldwide, we have to wonder how long business will find the radical Right palatable. Constant turmoil and increased tariffs are making CEOs wonder how useful Trump really is. It’s time to take gramps out of the White House and put him in a nursing home.

Beyond the rise of authoritarian power, 2018 was the year in which the rapid pace of climate change became obvious to anyone with a pulse. I am not a big fan of Alexandria Ocasio-Cortez (democratic socialism is a ticket to another right-wing victory), but her Green New Deal just makes sense. The US has spent trillions upon trillions subsidizing oil in various ways (from outright subsidies to the construction of roads which are, of course, paved in oil) and fighting wars in the Middle East to safeguard fossil matter, so why shouldn’t we treat energy independence as a matter of national security? There are 50,000 coal miners in the United States, fewer than the 89,000 employees of Sears who will lose their jobs this year and far fewer than the 1.6 million university faculty in the US. If the Democrats want to win in 2020, running on a platform of stopping the rise in temperatures worldwide and the ballooning national debt while restoring basic rights and freedoms taken away during the Trumpian regime would be a good place to start (given that the GOP has forgotten about the deficit now).

As for architecture, what is there left to say about it anymore? Starchitecture has faded; nobody gets excited about cool forms anymore. How can we be surprised? First, no starchitect is making interesting buildings; in fact, the whole movement has been something of a bust. Second, architecture is no longer the profession that shapes space, digital technology is. Failing to recognize this dooms the profession to irrelevance, like heraldry in the days of mustard gas.

But architecture isn’t the only institution without purpose. Silicon Valley, it seems, has finally met a time in which nobody cares about what it makes or promises. People are not only tired of big tech, they are tired of startups that promise the world when their only business plan is to be acquired as soon as possible. In fact, for all its promises, startup culture was a bust and it is far smaller than it was two decades ago. Apple made its best products ever (I am typing this on one of the amazing third generation iPad Pros that I bought), and was punished for it by a massive drop in its stock price.

If any tech became widely accepted by the mainstream in 2018, it was the Internet of Things and the Smart Home. Amazon’s Alexa, Nest and Ring’s video doorbell, and Lutron’s Caseta system were among the winners in this transformation of our interior lives. There is nothing terribly radical about the smart home and, frankly, a lot of the panic about surveillance with the hardware is silly (as if smart phones don’t already do this). But embedded technology is everywhere now.

Still, it’s odd how art (and architecture) misses this change. For want of anything else, we are still in the era of post-Internet art, an idea which, unfortunately, I am somewhat to blame for. If there was some merit to thinking about how network culture permeated art in 2011, talking about “post-Internet art” now is about as useful as talking about Abstract Expressionism as “post-automobile” art. Art, like architecture, has lost any purpose or drive forward. Technology and art have drifted apart again and only a few of us hack away at the intersection of the two. Still, art and architecture are always falling into ruin and being reborn. Perhaps this time will be no different and the work we are doing will lead to a rebirth?

The academy is sick as well. Years of poor management practices and bloated administrations have gutted the arts and humanities as faculty were forced to take on heavy teaching loads and real research has been eliminated (in case you wondered, I left Columbia when the new Dean did away with the entire research arm of the school to appease the finance office). Two decades ago, I decried “staff-ism” in schools, but now that is all that’s left.

I left teaching completely this year, resigning from my position at the University of Limerick, Ireland after thirteen years and bringing nearly thirty years of teaching to an end. In large part, it was the basic inability of universities to function that drove me away. What good is it for me to waste my time trying to jump through hoops to get paid when there are people in finance offices whose job literally is to ensure that faculty don’t get paid (I’ve been told this point blank)? And teaching itself isn’t much fun anymore. Students, for their part, are more interested in looking at their Instagram feeds than in listening to what I have to say. It’s the opposite of the 1960s when students proclaimed the irrelevance of their teachers. Now, faculty proclaim the irrelevance of their students. Bah. It’s not worth it. It was a mistake to keep going over the last couple of years. I may come back to education one day—I have many great memories that come from my students and many of them remain my friends to this day—but now is a time when the university is very much irrelevant. Independence is what we need, not sick institutions.

Speaking of sick institutions, there is welcome news this year regarding Facebook: we saw the first signs of that hated enterprise starting to implode. Zuckerberg’s pathetic attempt to get a date by building a Web site has wound up doing tremendous damage to the Internet with its reduction of all content to a general level of idiocracy. Older forms of Internet communication such as blogs, email lists, and Internet forums are dying and since nobody reads books or magazines anymore, we communicate less than we did thirty years ago. Instead, we don’t even get FarmVille, we get social diarrhea. Nobody likes Facebook. Independent voices are needed on the net again. It’s not up to someone else to provide them, it’s up to us.

I rebuilt my Web site last week in hopes of returning to being an independent voice in the field. I finished the last year in review with a similar resolution; maybe this year, I’m getting cranky enough that it’ll actually happen.

I am trying to break the Internet

I don’t see how we can remain enthusiastic about network culture. In the decade since the release of the iPhone, the Internet has gone from being a playpen for geeks and outsiders to the primary theater for politics and culture. Even three years ago the thought that a major global leader would use Twitter to announce major policy initiatives would have seemed futuristic and a little naïve. Now we have it and it’s the darkest time most of America has collectively experienced in memory. We lurch headlong into a future, but it’s a new bad future.

Even as we drown in information, we seem to have lost our ability to communicate and absorb knowledge. Almost nobody reads and writes blogs anymore (please don’t get me started about Medium and the final, thorough destruction of independent content its startup model is premised on). Magazines and journals are well and truly dead. Most books by theorists and academics are soundly ignored too, which is probably a good thing given that theory has become permeated by a neo-fascist identity politics. Outside of the telecocoon of our partners, children, closest friend or two, and immediate work associates, we no longer call each other, we no longer e-mail each other, and we ghost each other as much as we text each other. Earlier forms of Internet culture are also on the rocks: listservs have been replaced by Facebook groups that produce no thought or discussion of any substance whatsoever, and most online forums have died as well. The aforementioned Twitter should be dead, but is kept alive by our desire to see what the lunatic in the White House will say next, and otherwise serves as home to a few misfits and general oddballs. Facebook absorbs everything, reducing all human communication to nothing as it algorithmically directs us to only see those posts that give us a fleeting satisfaction. Its more imagistic spawn, Instagram, is the model of the new Internet, driving us to constantly one-up each other with lifestyle pornography. There are, of course, exceptions—I actually like Reddit and there are some niche online forums that have moments of productivity—but it’s bad out there. I have played my own part in migrating to these awful commercial platforms and regret it. Something must be done. Endless promises and little delivery.

In the case of this site, I’ve resolved to fix matters over and over again, but each time I was undone by the content management system, Drupal. Drupal is awful. Way back in 2005, Drupal seemed like a good choice, with its module-based open architecture, and the promise of a content management system that could go far beyond a blog. At the time, I was fascinated with the idea of the networked book and Drupal seemed to offer such functionality built in. Unfortunately, Drupal long ago began to resemble a 1980s American car company, suffering from over-complexity, putting design and user interface last, and unable to get basic features working. A critical flaw is that the development team long ago decided that “the drop is always moving”: each major version of Drupal breaks all the existing plug-ins and themes without which a site is ugly and limited. And therein lies the rub. I should have updated my site eight years ago when Drupal 7 was released, but the horribly botched release of that version would have broken my site thoroughly if it had been possible to update it at all (I am now forgetting if it was Drupal 7 or 8 that did not have upgrade capabilities when first released and the upgrade cycle to Drupal 6 cost me over a month of work). Add to that constant trouble with excessive memory overhead and the obliteration of comments by spambots, together with a general feeling that the whole creaking mess was going to explode like a steam-powered Soviet tractor, and I knew I’d never be upgrading Drupal. But where to go?

A New Year’s resolution for 2017 had been to redo my site in a new content management system, and I upgraded the front end of my site to Kirby. Kirby is fantastic. It took me an afternoon to build the site (I am still running Kirby over at the Network Architecture Lab); in Drupal it would have taken a week. Still, although now I had a decent portfolio together, the blog languished: I made a handful of posts in 2017 and only one post in 2018. In part, this was due to life: I am still cleaning up after the decades of neglect that our house suffered before we bought it, I am spending more time with my family, and I am working on my conceptual art and sound practices. Still, doing anything in Drupal was a nightmare and I stayed away from any substantive blogging.

So it happens that we were away skiing at Smugglers’ Notch in Vermont this week and, as so often happens, we had freezing rain all day. So I decided to finally upgrade this blog and move it to WordPress. I had used WordPress before in Jo-Anne Green’s Networked book project, which pushed the platform beyond what it was capable of at the time and left me with a bad opinion of it. But in the intervening years, WordPress has matured into a capable platform (even with the recent growing pains caused by the Gutenberg editing interface).

Frédéric Gilles’s amazing Drupal to WordPress plugin imported all of the data from Drupal—even comments!—better than I had ever dreamed was possible, better than I would have expected from an upgrade within Drupal. I’ve long loved Indexhibit and use it on AUDC’s site so I was glad to base the new site on Leanda Ryan’s Inxhibit theme (much as I love Indexhibit, it simply isn’t designed for blogging). A day of work later and, although there are some bugs here and there, I am confident enough to replace Varnelis.net with WordPress.
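For the curious, the overall shape of such a migration is simple even if the details aren’t. Below is a minimal sketch of the idea, not of how Gilles’s plugin actually works: it assumes a stock Drupal 6 schema (node joined to node_revisions), uses placeholder connection details, and writes a bare-bones WXR file for WordPress’s importer. A real export needs far more than this (authors, slugs, taxonomies, comments).

```python
from datetime import datetime, timezone
from xml.sax.saxutils import escape

import pymysql  # assumes the Drupal database is MySQL, as it usually is

conn = pymysql.connect(host="localhost", user="drupal", password="secret",
                       database="drupal6")  # placeholder credentials
items = []
with conn.cursor() as cur:
    # In Drupal 6, the current body text lives in node_revisions, keyed by vid.
    cur.execute(
        "SELECT n.title, r.body, n.created "
        "FROM node n JOIN node_revisions r ON n.vid = r.vid "
        "WHERE n.type IN ('blog', 'story') AND n.status = 1"
    )
    for title, body, created in cur.fetchall():
        date = datetime.fromtimestamp(created, tz=timezone.utc)
        items.append(
            f"<item><title>{escape(title)}</title>"
            f"<content:encoded><![CDATA[{body}]]></content:encoded>"
            f"<wp:post_date>{date:%Y-%m-%d %H:%M:%S}</wp:post_date>"
            "<wp:status>publish</wp:status>"
            "<wp:post_type>post</wp:post_type></item>"
        )

# A bare-bones WXR envelope; the importer wants more fields in practice.
wxr = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<rss version="2.0" '
    'xmlns:content="http://purl.org/rss/1.0/modules/content/" '
    'xmlns:wp="http://wordpress.org/export/1.2/">\n'
    "<channel><wp:wxr_version>1.2</wp:wxr_version>"
    f"{''.join(items)}</channel>\n</rss>\n"
)
with open("drupal-export.wxr", "w", encoding="utf-8") as f:
    f.write(wxr)
```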

One of the major impediments to blogging that I faced with Drupal is that inputting text on the Web is a nightmare (I can’t tell you how many posts I’ve lost by accidentally hitting the wrong key on Drupal) and uploading text inevitably seemed to introduce formatting issues, no matter how hard I tried. In the case of WordPress, I knew that there were many more tools at my disposal so I decided to try iA Writer and found it worked flawlessly. I started writing this post on my Mac, seamlessly picked it up on my iPad Pro, and posted this entry. This is simple, the way Content Management Systems were supposed to be, without losing the independence that blogs make possible.

I’ve also put Feedly front and center for daily reading on my iPad. There are plenty of great blogs out there acting as a resistance to the managed content of Facebook, Twitter, and Medium and I am setting out to rediscover them. With these changes to my blog, maybe, just maybe, I may once again join them as well.
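Feedly is, at bottom, an RSS poller, and nothing stops anyone from running the same daily loop themselves. Here is a minimal sketch using Python’s feedparser library; the feed URLs are placeholders for whatever blogroll you assemble.

```python
import feedparser  # pip install feedparser

# Placeholder blogroll; substitute the feeds you actually follow.
feeds = [
    "https://example.com/blog/feed/",
    "https://example.org/notes/rss.xml",
]

for url in feeds:
    parsed = feedparser.parse(url)
    print(f"\n{parsed.feed.get('title', url)}")
    for entry in parsed.entries[:5]:  # five most recent posts per feed
        print(f"  {entry.get('title', '(untitled)')} -> {entry.get('link', '')}")
```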


Network Histories at Michigan, 3/8/18

I am delighted to be delivering the keynote address at the P+ARG Conference at the University of Michigan on March 8, 2018, at 6pm. More details here.

My talk is titled "Network Histories: Baran and Milgram in Perspective" and the abstract reads roughly … 

The foundational work done by social psychologist Stanley Milgram and telecommunications researcher Paul Baran on networks in the 1960s remains profoundly influential today, establishing the basis of network theory. But both projects are more complicated than they seem: Milgram’s famous “Six Degrees of Separation” appears to have been largely fabricated while Baran’s plan for a “Distributed Network” is inevitably read within a retrospective mythography. This talk sets out to uncover not so much a theory of networks as an ideology of networks, seeking not a celebration but rather an understanding.
