Photographs from this project are on display in Carlos Gor’s exhibit “Vernacular. Diálogos Entre Micropolíticas Del Paisaje” at La Madraza, Centro de Cultura Contemporánea in Granada, Spain. There are many more images in the gallery above, and I will eventually put up the full text of the essay as it goes into the accompanying catalog.
In his 1961 book, Megalopolis: The Urbanized Northeastern Seaboard of the United States, geographer Jean Gottmann adopted the Greek term “megalopolis” to refer to vast conurbations such as the northeastern seaboard of the United States, from Washington, DC to Boston. This megalopolis, he observed, would not have a uniform population density but rather would be composed of multiple urban centers and the interstitial spaces between them.
Even as megalopolitan territories have developed across the globe and the price of land within their boundaries has skyrocketed, interstitial spaces persist within them as what Ignasi de Solà-Morales called terrain vague: leftover spaces that have been abandoned for one reason or another. Understanding the potential of such sites—not to develop them, as is the vogue for too many architects, but to leave them as is or to allow damaged ecological systems to be restored—is a critical task for urban researchers today.
In this photo essay, I look at a group of such areas in Essex County, New Jersey, the densest and most urbanized county of the densest and most urbanized state in the United States, specifically at the huge swamps remaining from the draining of Glacial Lake Passaic, which comprise roughly a quarter of the county’s acreage. My interest is both in the history of and conflicts over these areas and in the tenuous relationship individuals have with their borders, especially now that extreme weather events have become annual occurrences.
Perhaps you are here because Facebook is down. Good. This site isn’t. Stay here, then visit some other web sites and don’t go back.
This is going to be an unpopular post. If this makes you upset or hurts your feelings, sorry, it’s said with love and concern for my dearest friends and family. If you aren’t on Facebook, good for you! Pat yourself on the back and convince anyone you can to follow your example.
Facebook is a terrible company and an awful thing for democracy. It pretends to offer belonging when all it is really out to do is exploit its members. I am thoroughly disappointed by my friends who use it all the time and should know better (ahem, if you are an academic, this means YOU… ). By 2016, the average Facebook user spent a staggering fifty minutes a day on that site chock full ‘o nuts. Want to find out how crazy it is? Go visit r/hermancainaward on Reddit and ask yourself what site the contributors are posting excerpts from. No, it’s not Twitter. Facebook is toxic. Get off it.
I’ve spent about as much time on Facebook over the last few years as off, and I periodically shut down my account. The only reason I’m still on Facebook is that (most of you) people are too lazy to visit this site without it being spoon-fed to you via a post on Facebook. I can see this clearly in my logs and, frankly, it’s pathetic. Facebook is not ok. Have we not learned anything yet? Just yesterday on 60 Minutes, Frances Haugen, the brave Facebook whistleblower, revealed just how bad things are. An internal Facebook document baldly stated: “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.” The company, Haugen explains, is hiding this information from the public. Facebook is like Purdue Pharma, which heavily marketed its opioids even though it knew how addictive they were. Walk away from it. If you can’t visit people in real life, get an RSS reader, visit Reddit, go to Twitter. Stay away from Facebook; it’s evil.
People claim that Facebook brings people closer together. No, it does not. It replaces lived relationships with relationships mediated by clicks and likes. Facebook’s algorithms choose what you see. Facebook monitors your communications for keywords so that it can market to you. People over 40 may remember the days when we used to call each other for an hour on the phone or write letters to each other, even emails. What happened to that? There was genuine communication there. Think of all of the books of letters that have been published over the years. Nobody wants to see their shitty Facebook posts.
Over the last fifteen years, I’ve watched as social interaction has withered and formerly lively internet forums have died while Facebook has taken over their purposes. If you maintain a forum on Facebook, shut it down. It’s bad for everyone involved. Read a book or a magazine, listen to some music, visit the rest of the Internet if you must, but don’t visit Facebook. When Facebook is back up (I hope never), I will be reducing my presence there still further. Do your part. Don’t promote Facebook; fight it tooth and nail.
This is a personal reflection. I’m not sure anybody else on Earth feels this way, but if you do, or if you have found ways to keep travel alive for yourself, do leave me a note.
I don’t understand the point of tourism anymore.
Call it the existential problem of tourism. Whether Mickey-Mouse-hat-and-fanny-pack-wearing tourist or avant-garde academic, we would leave home to find stimulus in the unfamiliar. But does that still apply today?
Overtourism, together with the migration of merchandising and social behavior to online venues, has undone the differences between places. Will an Irish pub in Ireland be better than an Irish pub in a random town in the US? I’ve been to dozens upon dozens of Irish pubs in Ireland on my many Hibernian trips and quite a few in the US, and it’s not clear. If I go into a random restaurant in Rome, will the person next to me be a Roman, an American, or a German?
Residents of city centers worldwide have left in favor of short-term rentals, and with them, stores have left, replaced by tourist restaurants and tchotchke vendors. Rents have generally skyrocketed as well. I have observed that the quality of restaurants peaked around 2000, maybe a little after that. It may be that when one is young the world is new, but just as much, I suspect, rising rents have priced out quality. If you have to spend your money on rent, how much is there to spend on good labor or on good ingredients? Rising rents have squashed not only innovation but also the old classics. Ideally, a city would have a balance of both. Energetic young bars and restaurants are easy to admire, and it’s easy to say that old bars and restaurants are stuffy and boring, but they also have distinct character, full of the memories of the city, often providing an experience of going back in time, as at Madrid’s La Venencia. That’s hard to do now, and La Venencia is one of the few places to guard its status carefully.*
I suppose I was lucky when I travelled in my academic career. As a historian of architecture and network culture, I usually had an excuse to go somewhere: Istanbul, Rotterdam, Valparaiso (Chile), Stockholm, Knoxville, Berlin, Madrid, Auckland, Munich, New Orleans and on and on. Generally, I’d be going somewhere because I’d be delivering a lecture or teaching a course. This would often result in less than ideal conditions for tourism since it’d inevitably take place in the academic year and I’d only have a day or two before I had to head back to teach, sometimes less (I once went to Vancouver for 90 minutes, although in fairness I had already been there for a few days before). Sometimes I’d have specific works of architecture or urban conditions to see. But that too would bring up the central problem of tourism, which is that it is voyeuristic.
As my followers well know, unlike many contemporary academics, I don’t subscribe to the naïve notion that cultural appropriation is inherently bad. On the contrary, it’s at the very heart of the human condition, and without it, we would not only be much poorer, we would be more xenophobic, more racist, and less likely to see the other side. That said, I find it uncomfortable to walk into a church these days, even the Vatican or the Duomo in Florence. Who are these people? Why are they practicing these strange, superstitious customs? It’s 2021. I went to a Catholic elementary school for a few years, but my parents always stressed that they were European Catholics and didn’t practice. As time has worn on, I simply don’t understand them. A stone circle or a sacred altar from pagan Lithuania makes infinitely more sense. Similarly, I couldn’t bring myself to visit the Blue Mosque in Istanbul, and I doubt I’ll go back to the Hagia Sophia now that it has become a mosque again. Going to see Buddhist temples only reinforces that most forms of Buddhism are incomprehensible to the largely twentieth-century practice of Western Buddhism. It all seems very voyeuristic, and as a practice, voyeurism is only acceptable when it is a game played with an exhibitionist who is in on it. Otherwise, it is uncomfortable at best and a form of violence at worst.
And what else does one do when abroad besides look at buildings? You can go see some museums, but a corollary to overtourism is that one has seen too much. There are too many museums today and most are disappointing. Even the storied ones in wealthy cities frequently disappoint. The lighting of Las Meninas at the Prado is a prime example: it’s impossible to take in the painting because of the reflections on its surface. I’ve been to many museums in wealthy European cities and been really disappointed. American museums are no different. Going to MoMA is a horrible experience. The new Whitney is deeply unpleasant. These are not good places to see art. The modern and contemporary sections at the Art Institute of Chicago are tragic, victims of short-sighted decisions (notably the third-rate Edlis/Neeson collection, which is inexplicably committed to public display for a stunning fifty years, at which point the whole thing can be properly binned). Dia:Beacon, which used to be one of the best museums in the world, has stumbled dramatically under recent curatorship: two shows of Charlotte Posenenske’s oeuvre? One was more than enough. Sometimes I get lucky: the show of Constant’s work that I stumbled upon at the Reina Sofía was a stunning surprise. The Akron Museum of Art has a remarkable collection of postwar art, much more interesting than that at the Art Institute of Chicago. But I find such experiences rarer these days as museums pander to the woke and their similarly privileged friends, oligarchs seeking to launder money.
So why tourism? Modern tourism dates back to the 18th-century practice of the Grand Tour, when nobles and wealthy British men would go abroad to France and Italy as a rite of passage, radically unsettling their sensoria by seeing cultures quite unlike their own, together with the great works from the Roman era to the Baroque. The idea was that embodied knowledge was even more important than book knowledge. But what to make of it now? Can one really be invigorated by travel in such a manner anymore, except maybe by, as has become the vogue lately, flying into orbit around the Earth? There is no question I would do that, but the cost of space travel is still ridiculously high.
Visiting natural areas is no different. The “wild” places that provided the antidote to civilization have now been thoroughly trampled. Our state parks are full of botanical escapees from home gardens. The selfie stick is as hard to avoid in front of Yosemite Falls as it is in the Colosseum. I suppose that gardens and botanical centers are still worth visiting, but I am not terribly interested in formal ones or those driven by landscape architecture, as they remind me of the damage we have done to the environment.
Can one undo this with alternative spatial practices? I don’t think they are so productive anymore. Debord’s dérive, the “technique of rapid passage through varied ambiances,” was meant as an antidote both to urban boredom and to the planned, scheduled tourism of his day, but take a dérive through the city today. Will you really feel that different in Brooklyn or Manhattan? In Queens? I don’t think so. It’s all more of the same. Behind the windows are people watching the same Netflix shows you are.
Mind you, there are places of genuine difference these days, but should we really travel to them? COVID aside (it makes travel to Asia virtually impossible these days), do the Japanese really want Americans or Europeans there anymore? They are much less interested in studying abroad these days. Do the Chinese? Rates of English-language instruction are falling there. Afghanistan is certainly different, but it’s definitely not a good idea to go there.
COVID may change the nature of tourism. I dearly hope it does. The migration of jobs out of the city may free up space for residents to flock back in, or for more Airbnbs. Cities such as Venice, Amsterdam, and Rome are actively looking for more sustainable ways forward, culturally and ecologically. I certainly hope they succeed. We all need it if we are to find actual meaning in travel again.
*The last time I was at La Venencia a tourist was literally shown the door for trying to order a beer. You’ll also be shown the door if you are dumb enough to try to take a selfie (there are signs on the wall about this), although this is as much resistance to tourism as it is a holdover from the Franco days when it was a Communist hangout and photos could mean the death of patrons.
In February 2020 I had the opportunity to spend a week in Owego, New York for a residency at Signal:Culture. This was a purely research-based residency spent working on a software tool. For my 2016 project Perkūnas, I developed a Python back end for counting the number of smartphones (and other WiFi devices) present with reasonable precision and using that count to drive Eurorack modular synthesis equipment. During this residency, I refined this back end, optimizing it for video synthesis with LZX modules. There was never any goal of producing anything finished that could be displayed in an art gallery, but the following coarse feedback video shows an instance of how this worked. During one of the many dinners we enjoyed together, my colleague Martin Back and I talked about many things, among them the new plague emerging in China. In one of the dumbest statements I have ever made, I said that I thought the Chinese would contain it and it would blow over quickly.
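For the curious, the general idea of such a back end can be sketched in a few lines of Python. This is purely a hypothetical illustration, not the actual Perkūnas code: the function names, the RSSI cutoff, and the 0–5 V scaling are all assumptions of mine, and a real version would capture sightings from a WiFi interface rather than a list.

```python
# Hypothetical sketch: count unique nearby WiFi devices and map the
# count onto a control-voltage value for a modular synthesizer.

def count_devices(sightings, rssi_floor=-70):
    """Count unique devices from (mac, rssi) sightings, ignoring
    signals too weak to plausibly be in the room."""
    return len({mac for mac, rssi in sightings if rssi > rssi_floor})

def count_to_cv(count, max_devices=32, cv_range=5.0):
    """Map a device count onto a 0-5 V control-voltage range,
    clamping at max_devices."""
    return min(count, max_devices) / max_devices * cv_range

sightings = [("aa:bb", -40), ("cc:dd", -55), ("aa:bb", -42), ("ee:ff", -90)]
n = count_devices(sightings)  # "ee:ff" is filtered out as too weak
print(n, count_to_cv(n))      # 2 0.3125
```

The interesting design questions are in the thresholds: how weak a signal still counts as “present,” and how to scale a crowd of phones into a voltage range a module can use.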
According to conventional chronological schemas, 2020—not 2019—was the last year of the 2010s.* This is convenient since, as I pointed out in last year’s premature review of the decade, the 2010s were “the decade of shit,” and 2020 was a stinking pile of shit. The worst decade since World War II ended with the worst year since 1945.
My “year in review” posts are usually almost as late as my taxes, and when I finished last year’s post on February 12, we were all well aware that COVID was out there. No question, I missed the severity of the pandemic back then, but I was on the money about its psychic effects. For all of the horror of COVID, it isn’t horrible enough. COVID is banal. Instead of making us bleed out through all of our orifices as Ebola does, COVID is “a bad case of the flu” that leaves people dead or with debilitating cardiovascular and neurological ailments. But how different is my diagnosis, really, from what happened?
Now sure, this year we’ve already had firestorms the size of Austria ravaging Australia, a rain of rockets in Baghdad, Ukrainian jetliners getting shot out of the sky, a deadly pandemic in China caused by people eating creatures that they really shouldn’t, and the failure of the Senate to uphold the rule of law, but the banality of it all is crushing. While the Dark Mountain set drinks wine around a campfire, gets henna tattoos, and sings along to songs about the end of nature, for the rest of us, it’s just an exhausting, daily slog through the unrelentingly alarming headlines.
COVID brought us yet more crushing banality. The Idiot Tyrant is gone, but we are trying his impeachment yet again. Everything changes, but nothing changes. We were all the Dark Mountain set this year, sitting around our campfires, singing songs about the End. It was another atemporal slog, one day bleeding into another, every day a Sunday in a country where everything is closed on Sundays and there is nothing to do, every day stranger and more disconnected than the last, something captured in comedienne Julie Nolke’s series of videos entitled “Explaining the Pandemic to My Past Self.”
Amidst the disconnection, the Jackpot, William Gibson’s term for a slow-motion apocalypse, cranked up a couple of notches. Just surviving the year was an accomplishment. The balance of life has been thoroughly disrupted and that disruption isn’t going away any time soon. It’s not just COVID: we now feel certain that there will be more pandemics, more massive wildfires, and more superstorms in our future. The Earth isn’t dying (sorry, climate doomers), but there will be huge losses of species worldwide, human population decline is well underway in advanced societies (the US is finally on the bandwagon here), and massive deaths will take place across the planet until the population comes back to a sustainable level decades from now.
But the premise of the Jackpot is that it isn’t a final apocalypse: there will be another side. In his Twitter feed (@GreatDismal), even Gibson focuses on the horrific and unjust nature of the Jackpot, but there will be winners, selected on the basis of wealth and sheer dumb luck. What might this say about the US election and the fact that 46% of Americans voted for a cretin? Now, there is nothing particularly new about melding Tourette’s and dementia into a public speaking style: there are plenty of lunatics sitting on their porches screaming obscenities at their lawn ornaments. Everybody knows that Uncle Scam’s persona as a billionaire—or rather the King of Debt (his own term!)—is an act. The man with the golden toilet is not a successful businessman. He is weak, a loser who can’t stay married or stay out of bankruptcy court. Four years of misrule ended in abject failure: defeat in both the electoral and popular votes, bans from social media, and, with his businesses failing, being forced out of office in shame to face an unprecedented second impeachment, an array of civil litigation, and criminal investigations into fraud, tax evasion, incitement to riot, and rape. But this—not a misguided notion of him as a success—is the real point of his appeal. The short-fingered vulgarian is a life-long loser, a reverse Midas whose every touch turns gold to lead. In the face of the Gibsonian Jackpot, his appeal was as a stupid version of Homer Simpson, grabbing whatever scraps he could and, when that failed, LARPing as President, destabilizing society, and just blowing everything up.
LARPing was big in 2020, which saw the attempted kidnapping of Michigan Governor Gretchen Whitmer by wingnut idiots, various insane protests by COVID deniers, the attempted coup of the Capitol Insurrection, and the riots developing after the Black Lives Matter protests. BLM was the standout among these, not only a good, just cause, but also because the majority of the protests themselves were peaceful—such as the one in our town of Montclair, New Jersey. None of that was LARPing, but the riots that accompanied it were. For the most part, this was less people with genuine grievances and more Proud Boys, Boogaloos, anarchists, and grifters who came in to loot and burn whatever they could. Although there were kooky moments on the Left, like the Capitol Hill Autonomous Zone, Antifa, for however much it exists, didn’t do much, certainly proving to be far less trouble than white supremacist-infiltrated police forces in paramilitary gear. Still, the widely-vaunted second Civil War never came about, and the arteriosclerotic LARPers on the Right limped off the field in defeat after they got a spanking at the January putsch.
A number of observers of both the Capitol Insurrection and CHAZ—including some of the idiots who took part in them—noted that these events felt much like a game, specifically an Alternate Reality Game (ARG). In a typical ARG, players look for clues both online—think of the QAnon drops, the Trumpentweets, or the disinformation dished out by the skells at 4chan, 8chan, and so on—and out in the world. Jon Lebkowsky, in a post at the WELL’s State of the World, and Clive Thompson, over at Wired, compare QAnon to an ARG. Indeed, gaming is taking the place of religion (whichever grifter figures out how to meld this with Jesus and his pet dinosaurs will get very rich indeed), with the false promise that playing the game and winning will deliver one to the other side of the Jackpot. Somewhere, I read that when asked what he would do differently if he were making Blade Runner a decade later, Ridley Scott replied that he would be able to skip the elaborate sets and just point the camera down the streets of 1990s Los Angeles. The same could be said for the Hunger Games today.
But not everything was LARPing. If Cheeto Jesus is an icon for LARPing losers, Biden was elected on the premise of staving off the Jackpot by returning adults to the White House. This is not a bad thing; we might as well try. Still, from the perspective of Jackpot culture, the most interesting political development of the year was the candidacy of Andrew Yang, whose cheery advocacy of Universal Basic Income (aka the Freedom Dividend) masked the dark, Jackpot-like nature of his predictions. To quote Yang’s campaign site: “In the next 12 years, 1 out of 3 American workers are at risk of losing their jobs to new technologies—and unlike with previous waves of automation, this time new jobs will not appear quickly enough in large enough numbers to make up for it.” No matter how friendly Yang’s delivery, there is a grim realism to his politics, an acceptance that things will never be better for a massive sector of the population. Certainly some individuals will find ways to use their $1,000-a-month Freedom Dividend as a subsidy to do something new and amazing, but 95% will not. Rather, they will form a new and permanent underclass as they fade into extinction. Again, the point of Yang’s candidacy isn’t the cheerleading for math and STEM; it’s the frank acknowledgement that the Jackpot is already here.
On the other hand, toward the end of the year, Tyler Cowen suggested that we might be nearing the end of the Great Stagnation (he is, of course, the author of an influential pamphlet on the topic); you can find a good summary of the thinking, pro and con, by Cowen’s student Eli Dourado here. In this view, advances such as the mRNA vaccines, the spread of electric, somewhat self-driving vehicles, the pandemic-induced rise of remote work, and the huge drop in the cost of spaceflight are changing things radically and could lead to a real rise in Total Factor Productivity from the low level at which it has been stuck since 2005. Is this a sign of the end of the Jackpot? Unlikely. That won’t come until a series of more massive technological breaks, probably (but not necessarily) involving breakthroughs in health (the end of cancer, heart disease, and dementia), the reversal of climate change, working nanotechnology, and artificial general intelligence. But still, there are signs that early inflection points are at hand.
Personally, we experienced one of these inflection points when we replaced our aging (and aged) BMWs with Teslas. I wound up getting a used Tesla Model S last January and then immediately turned around and ordered a brand new Model Y, which we received in June. No more trips to the gas station, and while “Full Self-Driving” is both expensive and nowhere near fully self-driving, it is a big change. Longer road trips—which under the pandemic have been to nurseries on either side of the Pennsylvania border to buy native plants—have become much easier, even if I still have to keep my hands on the wheel and fiddle with it constantly to prevent self-driving from disengaging. But harping too much on the incomplete nature of self-driving is poor sport: in the last year, Tesla added stop-light recognition to self-driving, and a new update in beta promises to make city streets fully navigable. Less than a decade ago, self-driving was only a theoretical project. Now I use it for 90% of my highway driving. That’s a sizable revolution right there. The all-electric and connected nature of these cars also makes getting takeout and sitting in climate-controlled comfort when on the road a delight. Electric vehicles were a big success this year, and in our neighborhood, which is a bellwether for the adoption of future technology (when I saw iPhones replace Blackberries on the bus and train into the city, I bought a bit of Apple stock and made a small fortune), Teslas have replaced BMWs as the most common vehicle in driveways.
Back to the pandemic, which has accelerated a sizable shift in habitation patterns. Throughout the summer, there was a lot of nonsense from neoliberal journalists and urban boosters about how cities were going to come back booming, but with more bike lanes, wider sidewalks, less traffic, and more awesome tactical urbanist projects to appeal to millennials. Lately, however, those voices have fallen silent, and with good reason. In this suburb, the commuter train platforms are still bare in the mornings, and the bus into the city, once packed to standing-room-only levels every evening, hasn’t run in five months. A friend who works in commercial real estate says that occupancy in New York City offices is at 15% of pre-pandemic levels. Business air travel is still off a cliff. Remote work isn’t ideal for everyone and every job, but neither was going into the office. For sure, the dystopian open offices, co-working spaces, and offices as “fun” zones are finished. People are renovating their houses, or upsizing, to better live in a post-pandemic world of remote work. Another friend, who works for a large ad agency, told me that they did not renew their lease on office space and do not plan to ever go back to in-person work, at least for the vast majority of the staff. When employees gain over two hours a day from not commuting and corporations save vast fortunes on rent, remote work seems a lot more appealing. Retail sales here and in the surrounding towns have gone through the roof, just as they have in many suburbs.
But it isn’t just suburbia that has prospered at the expense of the city; exurbia has returned too. Way back in 1955, Auguste Comte Spectorsky identified a growing American cultural class that he dubbed “the exurbanites,” made up of “symbol manipulators” such as advertisers, musicians, artists, and other members of what we today call the creative class. Spectorsky observed that many of these individuals eventually tended to drift back to the city. This time may be different. After two decades in the city, the creative class is turning to places outside it with attractive older houses and midcentury modern properties, walkable neighborhoods (virtually all of Montclair, for example, has sidewalks), good schools (which generally mean high property taxes but are an indicator of a smarter, engaged populace), amenities like parks and places to hike, decent bandwidth, and independent restaurants, shops, and cultural attractions. There will always be variations in taste: some people really do want to eat at the Cheesecake Factory and live in a Toll Brothers McMansion, but these will appeal to relatively few of the people fleeing cities at this point. Thus, the Hudson Valley—full of older, more interesting architecture, great natural resources, and quirky towns—is booming. I predict some reversion toward the mean after the pandemic ends, when some of the people who fled to the country realize they aren’t suited to a place without SoulCycle, but this will be only a partial and temporary reversion.
I predict that even after the pandemic ends, there will be a greater interest in self-sufficiency among young people who move to suburbia and exurbia. Manicured lawns will be less important than vegetable gardens. Homesteading, permaculture, and a drive back to the land not seen since the 1960s are under way. It would be a very good thing if the next generation were more in touch with their land and less prone to hiring “landscapers” who treat properties as sites for industrial interventions such as chemical lawn fertilizer and a phalanx of gas-powered lawn mowers and leaf blowers to remove any stray biological matter.
As far as cities go, the pandemic is triggering a necessary contraction. The massive annihilation of real estate value it has caused should go a long way toward undoing the foolish notion that urban real estate is always a great investment. It’s not: just ask anyone who bought a house in Detroit in 1965. Real estate in first- and second-tier global cities has become wildly expensive, disconnected from the underlying fundamentals. When individuals are paying rents that absorb over 30% of their salaries to investor-owners who are not covering their mortgages with those rents, something is very wrong. This broken system has been able to function due to the perceived hedonic value of restaurants, bars, and cultural events, but these things too have been failing over recent years. Long prior to the pandemic, the cost of rent decimated independently-owned restaurants and retailers, with the latter also hurt by online shopping. The golden age of dining out (if it really was the golden age… I would say that better food could have been had in other, less copycat eras) was already declared over in 2019. “High-rent blight,” in which entire streets’ worth of storefronts sit empty due to ludicrous rents, has been common for some time now. Tourists made up more and more of the street crowds while loss-leader flagship stores for chains like Nike and Victoria’s Secret replaced local businesses. With the hedonic argument for staying in the city rapidly disappearing, it was only a matter of time before individuals began departing, and, in New York, population had begun to drop by 2018 (see more on all of this in Kevin Baker’s piece for the Atlantic, “Affluence Killed New York, Not the Pandemic”). Perversely, this is a good thing, as it will likely lead to a bust in commercial real estate prices and a decline in unoccupied or AirBNB’d apartments, thus making global cities like New York places that have potential again. Moreover, many second-tier cities such as St. Louis, Kansas City, and Cleveland are experiencing new growth as individuals able to work remotely look for places that are less expensive—and thus have more potential—than New York or San Francisco.
These shifts are huge and for the better. As I tried to tell my colleagues at the university, there is no housing crisis, at least not in the US and Europe; there is only the appearance of one because of the uneven distribution of housing: a glut in some areas, a shortfall in others. The pandemic has likely undone this a bit. Of course, places that are too politically Red, too full of chains, too full of copycat McMansions are unlikely to come back anytime soon, if ever. The Jackpot continues.
Still, one reason I foresee a perversely rosy future for the urban (and suburban and exurban) environment is the Biden administration’s interest in infrastructure. Back in 2008, I shocked design critics when I stated that there would be no progress in infrastructure for the foreseeable future. “But, Obama,” they complained. “But, Obama,” I clapped back, “just appointed Larry Summers as his chief economic advisor, and Summers will bail out the banks, not fund infrastructure.” I expect the opposite from Biden, who has adopted a nothing-left-to-lose position as a purportedly one-term President, is a devotee of train travel, and is eager to make great progress on climate change. Appointing Pete Buttigieg, one of his two smartest opponents in the primary (the other being Andrew Yang, of course), as Secretary of Transportation is a key move. This will be Buttigieg’s opportunity to prove himself on the national stage, and he will fight hard to do so, just as Biden expects. Expect more electrification across the board and, I suspect, more advances with self-driving vehicles. Certain measures will help cities—in the New York City area alone, the Gateway Tunnel between New Jersey and New York (now delayed over a decade thanks to Chris Christie’s and Donald Trump’s vindictiveness toward commuter communities that would not vote for them) and the reconstruction of the Port Authority Bus Terminal—but again, I predict more emphasis on decentralization and activity outside the city.
All this may have salutary cultural implications. The global city is played out. Little of interest happens in New York, San Francisco, London, Paris, or Barcelona. These cities are too expensive for the sort of experimentation that made them great cultural centers, and the diffusive nature of the Internet, together with capitalism and overtourism, has made them all the same. Residents of cities that have been victims of overtourism have seen this as an opportunity to reset, while the physical isolation of cities is going to increase reliance on local institutions. With some luck, all this leads to a new underground, with greater difference creating greater diversity and potential. On fashion, Bruce Sterling writes, "Fashion will re-appear, and some new style will dominate the 2020s, but the longer it takes to emerge from its morgue-like shadow, the more radically different it will look." The same could be true of all culture. Globalization was an incredibly powerful force, but it is spent. I don't agree with the protectionist instincts of the Trumpenproles, but today culture's hope is to thrive on the basis of the difference between places and cultures, not on greater sameness. Architecture has been very slow to react to all of this, in part because many intelligent young people have drifted into other fields, like startups, but I am optimistic that we might soon get past the ubiquitous white-painted brick walls and wood common table (the architecture of the least effort possible, to match fashion and food driven by the least effort possible), the tired old Bilbao effect, and quirky development pseudo-modernism.
So much optimism on my part! Even I am shocked that I am so positive. But why not? The end to this exhausted first phase of network culture is overdue. Time for a new decade, at last.
*The reason for this is that there is no Year Zero. 31 December 1 BC is followed by 1 January 1 AD.
Recently, a friend mentioned that she believed "natural philosophies" would become more popular in this decade. I responded by pointing out that AUDC's Blue Monday is subtitled Stories of Absurd Realities and Natural Philosophies. Here is an excerpt from the introduction that gets to the point of what a natural philosophy is and why we employed it.
We are certainly attracted by Deleuze and Guattari’s organizational logics and Hardt and Negri’s tales of Empire as well as Baudrillard’s theories of totalized consumption of the sign and the system of objects, Žižek’s fictionalization of the real, and Jameson’s dissection of late capitalism. These are fascinating explanatory frameworks, but, scandalously, they don’t add up. To weave a coherent whole out of them would be sheer madness. So how to proceed?
During our research into the evolution of art, science, and philosophy we found that these fields were once much more intimately related than they have been over the last century. Prior to the Enlightenment and the development of the scientific method, science was dominated by natural philosophy, a method of studying nature and the physical universe through observation rather than through experimentation. Virtually all contemporary forms of science developed out of natural philosophy, but unlike more modern scientists, natural philosophers like Galileo felt no need to test their ideas in a practical way. On the contrary, they observed the world to derive philosophical conclusions. Taken together, these didn't necessarily add up. But the lack of metanarrative for natural philosophy is not an obstacle for us; instead, it is a strength.
Natural philosophy flourished from the 12th century to the 17th, and with it flourished cabinets of curiosities: sometimes entire rooms, sometimes quite literally elaborate cabinets, filled with strange and wondrous things. These first museums collected seemingly disparate objects of fascination in a specific architectural setting, assigning to each item a place in a larger network of meaning created by the room as a whole. In the cabinet each object would be a macrocosm of the larger world, illustrating the wonder of its divine artifice. Together, however, their affinities would become apparent and a syncretic vision of the unity of all things would emerge, as the words Athanasius Kircher inscribed on the ceiling of his museum suggested: "Whosoever perceives the chain that binds the world below to the world above will know the mysteries of nature and achieve miracles." For the natural philosopher, the cabinet of curiosities possessed a reflexive quality: it was both an exhibition and a source of wonder, a system the natural philosopher built to instruct others but also to coax himself into further thought.[i]
This book, like our web site or an AUDC installation, is a cabinet of curiosities, consisting of a series of conditions that AUDC observes in order to speculate on them in the manner of natural philosophy, extrapolating not theories that could then be applied to architecture but rather philosophies to explain the world. The result is neither the relativist pluralism nor a single monist philosophy, but rather a set of multiple philosophies that almost add up, but being situationally derived, don’t quite.
[i] Patrick Mauries, Cabinets of Curiosities (New York: Thames and Hudson, 2002), 25-26. Some further sources on cabinets of curiosities are Lorraine Daston and Katherine Park, Wonders and the Order of Nature (Cambridge: MIT Press, 1998) and Giorgio Agamben, “The Cabinet of Wonder” in The Man Without Content (Stanford: Stanford University Press, 1999), 28-39.
What of art in this plague year? With our isolation compounded by unprecedented political polarization, is there any hope of exploring the commonality of shared experience through art? As many of us head into our lockdown again, we are all too aware that the implementation of “social distancing” has produced a transformation in our society, unprecedented within our lifetimes; how might art reflect on such processes?
Consider the term “social distancing.” This peculiar phrase is currently being used to refer to physical separation between individuals in order to reduce exposure to viruses, but prior to this year it had a long history in sociology, referring to social separation among individuals and groups. Like many sociological concepts, social distancing originates with the foundational writings of sociologist Georg Simmel. In his 1903 “The Metropolis and Mental Life,” Simmel describes the growing distance between individuals as a consequence of modernization and the spatial condition of the city:
This mental attitude of metropolitans toward one another we may designate, from a formal point of view, as reserve. If so many inner reactions were responses to the continuous external contacts with innumerable people as are those in the small town, where one knows almost everybody one meets and where one has a positive relation to almost everyone, one would be completely atomized internally and come to an unimaginable psychic state.
Social distance, then, develops when physical distance is not possible and serves as a means of protecting the psyche. Nor is it experienced alone. On the contrary, social distance is frequently experienced communally: modern individuals flock with others of their kind and, in doing so, experience distance from those outside the group, those that they perceive as unlike themselves. For Simmel, the stranger is both near and far from us: close enough to provoke anxiety yet isolated and therefore unknown. Hence, this sociological explanation would lead us to understand racism as a product of social distance. Racism, it follows, is not so much a prejudiced encounter with individuals at a distant remove as a violent misperception of individuals in close proximity. The increasing polarization of American politics between the "woke folx" and the "deplorables" is further evidence of this, with both groups existing as a defense against a dangerous Other while providing a mechanism for reinforcing their members' shattered identities. With contemporary life dominated by social media, the wounded self becomes a reposting machine for social media outrages, a means for the self to affirm its identity and existence through repetition, much as in a liturgical response. But being based entirely on the external threat of an Othered group, the result is a malfunction of social distance as psychic defense. Not only does political discourse grind to a halt, the self fails to develop as an individual. Increasing reliance on social media for human connection leads us toward a feedback loop of dopaminergic rewards for positive social stimuli and, like a chimpanzee in a lab cage punching buttons for cocaine-laced biscuits, we are led closer toward the "unimaginable psychic state" of the completely atomized dividual.
This sociological concept of social distance initially appears quite different from pandemic quarantine. And yet, it's no coincidence that political and racial tensions have risen to a fifty-year high in the United States: the social distancing of quarantine and the social distancing of the modern city share similarities. The chance encounters of the street, the park, and the pub have largely ended. Endless Zoom meetings with one's co-workers or classmates reinforce positions in class structures. Reading the news about groups that egregiously flout restrictions angers us and sets us against them. Our isolation leads us to become more addicted to social media and, as social media algorithms highlight our posts to those people who "liked" our previous posts, we condition ourselves to the rush of "likes," of feedback from those people who already give us feedback, creating a false sense of belonging and an experience of growing distance from those unlike ourselves. So, then, back to the original question I posed: is there any hope of exploring the commonality of shared experience through art?
Early telematic art offers a direction for us. Forty years ago Kit Galloway and Sherrie Rabinowitz, working as the Mobile Image group, created a series of projects entitled "Aesthetic Research in Telecommunications," both anticipating and surpassing the videoconferencing that has come to be identified with life under quarantine. In Satellite Arts: A Space With No Boundaries (1977), they connected dancers located three thousand miles apart via live satellite links. Although at times the dancers interacted on split screens, as we might see friends interacting on Zoom or Facetime, Galloway and Rabinowitz also composited them together on the screen visible to the audience, thereby creating a third, virtual space. In Hole in Space (1980), they created wormholes to bring together people from opposite coasts. Unlike our screen-based videoconferencing, Hole in Space employed life-size screens displayed in public spaces (Los Angeles's Century City Shopping Center and New York's Lincoln Center, respectively). Deliberately unadvertised, Hole in Space was, as they put it, "a social situation with no rules," a new condition that, as a documentary of the work shows, was profoundly moving to the people who participated. In a third project, the Electronic Cafe of 1984, they connected five restaurants—in Korean, Hispanic, African-American, and beach communities—with video equipment and drawing tablets so that individuals in one venue (including artists-in-residence) could exchange ideas and images with unknown individuals in another. Throughout, Galloway and Rabinowitz approached electronic technology as a means of overcoming both physical distance and the social distance between individuals.
It's possible to imagine what a new telematic art might look like today, outside the confines of conventional videoconferencing. For example, already in 2004, AUDC (myself and Robert Sumrell) proposed creating guerrilla versions of the Hole in Space, portable in a backpack, that could be deployed at random throughout a network of cities.
At the time we wrote:
Windows on the World is a formulary for a new urbanism that alleviates boredom with the city and encourages communication in public, rather than private settings. It facilitates open, spontaneous, and democratic exchanges between groups while requiring no special skills to operate. Participants share both their differences and similarities through direct interaction, replacing the myth of global hegemony and projected stereotypes with personal experience.
Windows on the World remaps our cities to create a telematic dérive. Guy Debord’s map “the Naked City” is now a map of the world. Each portal becomes what the Situationists called a plaque tournante, a center, a place of exchange, a site where ambiance dominates and planners’ power to control our lives is disrupted. People can wander from portal to portal within their city, experiencing and constructing different situations. Windows on the World changes the experience people have not only of the world, but of their city.
Like the Situationist dérive, Windows on the World operates outside of commerce and planning. There is no advertisement. The project is at its strongest when it is encountered by chance. Some portals are temporary, even hidden. Others are improbable, or not easy to access. In a back alley in Prague, shaded by a northern exposure, is a portal to a zoo in São Paulo. From a dangerous street in the Bronx, a door opens onto the Champs-Élysées. Another portal, in Zurich, looks out onto a busy railroad yard in Rotterdam.
Sixteen years have passed and the widespread adoption of social media during that time has led us to confuse the public and the private. We invite our friends and even strangers into our living rooms and our bedrooms. Formality has fallen away as suit and dress have given way to sweatpants and leggings. But there is no comfort to these forms of sociality or dress. Instead, we understand ourselves to always be on display, to have to perform at all times to receive our dopamine rush both for social interaction and for work, which is ever more subject to bureaucratic projects like “agile frameworks” and “total quality management” offering micro-rewards and micro-punishments for meaningless tasks completed. With this new search for (over) stimulus, there is every chance that a project like Windows on the World will fail as individuals provoke each other like chimps in separate cages at a zoo (or as humans do on Chatroulette). But thinking through how social connections and an appeal to a universal human connection might be made today—however obliquely—is an imperative task for art and technology.
Work in this vein certainly does not always need to provide an affirmative alternative space. Returning to my own work: at the Contemporary Art Centre in Vilnius, Lithuania in 2016, I built Perkūnas, a project that aims to make us sonically aware of the social distancing produced by network culture. Visually, the structure is both abstract—a simple architectural form—and a harbinger from another world, hence the name Perkūnas [the Lithuanian name for (the God of) Thunder]. Built of the large ventilation ducting commonly found (and heard) in art museums and other large public structures, Perkūnas's role isn't to condition the air in the environment; rather, it is to produce sound in accordance with the number of Wi-Fi-enabled devices in the room.
A microprocessor monitors the space for active Wi-Fi-enabled electronic gadgets, increasing or decreasing the amount of air flowing through the duct and, hence, the amount of noise in the room based on the number of gadgets. If there are no gadgets present or if everyone's Wi-Fi is disabled, the duct makes little or no sound. With a couple of gadgets, it makes a louder sound. The more gadgets, the more sound. Our ability to communicate verbally is directly affected by the number of gadgets. If we leave our gadgets behind or turn them off, Perkūnas stays quiet. But people don't read wall text; they look at their phones, distracted by whatever algorithm is guiding them through their day, and Perkūnas howls at them, their distraction drowning out their ability to communicate verbally.
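The control logic described above can be sketched in a few lines of Python. This is a hypothetical reconstruction, not the installation's actual firmware: `scan_wifi_devices` and `set_fan_speed` are assumed callables standing in for whatever device-counting and fan-driving the microprocessor actually performs, and the linear mapping from device count to duty cycle (saturating at a chosen maximum) is my own assumption.

```python
import time


def fan_duty(device_count, max_devices=20):
    """Map a count of detected Wi-Fi devices to a fan duty cycle in [0.0, 1.0].

    Zero devices means silence (0.0); the duct grows louder with each
    additional device, saturating at full roar (1.0) once max_devices
    (an assumed threshold) are present.
    """
    if device_count <= 0:
        return 0.0
    return min(device_count / max_devices, 1.0)


def control_loop(scan_wifi_devices, set_fan_speed, poll_seconds=5):
    """Poll the room for gadgets and drive the duct's fan accordingly.

    scan_wifi_devices: callable returning the current device count
    set_fan_speed: callable accepting a duty cycle in [0.0, 1.0]
    """
    while True:
        count = scan_wifi_devices()
        set_fan_speed(fan_duty(count))
        time.sleep(poll_seconds)
```

In a real deployment, `scan_wifi_devices` might count distinct MAC addresses seen in 802.11 probe requests over a short window, but the point of the sketch is only the proportional relationship: more gadgets, more air, more noise.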
Perkūnas doesn’t bring people together, but it suggests one direction for an art that can speak frankly to us about our social media-obsessed world without resorting to parody or false political promises about participation. Finding new ways for art to negotiate the space between social media and social distancing is a task at hand.
After the election of the Tyrant, a strange thing happened to me. I found my drive to make art and to write about culture evaporate. This didn't happen all at once and, for a time, I kept up a certain level of production, but I noticed my disdain for these activities rising. Somehow, producing for a failed culture seemed wrong. Instead, as I've chronicled here, I turned inward, taking on a variety of projects at our house, but also turning to radical gardening with native plants as a process of healing a small piece of land much damaged by human disturbance (notably the construction of this property forty years ago) together with the violence of contemporary landscaping. Practicing radical gardening has been incredibly valuable, but it is also a matter of temporal displacement: the work I have done won't be visible for years to come. It's an investment in a better future, optimism that this nightmare would end.
As for art, I continued research as a kind of secret activity, but I’ve definitely felt the need to retreat. At some point, I decided that I’d consciously go on an unannounced art strike while the Tyrant was in office. I couldn’t find a way to return to work that has been, at its basis, a sustained inquiry into how we behave to each other in the public realm. I’ve talked to a few friends about this and apparently this isn’t uncommon at all.
Even if I don't expect the political polarization crippling the US and the world to stop, with the Tyrant in his twilight, playing golf as performance art and flailing aimlessly about, there's no question that a certain weight is gone. The world is still damaged, but it's easier to sleep at night again; there's room again to breathe. If there's another lockdown—and there should be—I'll be here, taking notes and experimenting.
Writing and research are both in the works and you’ll be seeing evidence soon.
This month of rage has been building for a while: the last few weeks, the last three years, are nothing new, neither for the country nor for myself. A tyrant sits on his throne, spreading his bigotry and inciting hatred. Black men and women are dying in the streets, killed not so much by individual crooked cops as by a malignant system central to the disciplinary society that underlies our government.
Nothing new here. That's not a way of ducking the issue; that is the issue. What follows is a highly personal account that I set out to write to underscore just how deeply fucked up the discipline of architecture is.
Over thirty years ago, on August 6, 1988, I witnessed the peaceful Tompkins Square protest against neighborhood gentrification turn into a police riot after cops charged the crowd wielding batons. Police claimed that bricks and bottles were thrown, but I was there: they lied. Cops lie; it's a fact that my father—a conservative and fan of Ronald Reagan—taught me years before that. The cops charged us. We ran. As we inched our way back to the scene, I watched a young black man in dreadlocks being held by two cops on horses as they hit him over and over with their batons. I was fairly far away and I gauged the scene. I yelled at them and gestured obscenely, hoping they would chase me, giving the man some relief. I figured I'd duck into a storefront. Had they come after me, I am sure it would have been bloody, but they didn't. They continued beating him. Later, the police would face over a hundred complaints of brutality. Only two officers were actually found guilty of any wrongdoing and only one—conveniently for the NYPD, a woman—was dismissed.
I turned 21 that year and I was in New York because I was supposed to go to Columbia for graduate school in architecture, but after three days I'd had enough and I quit, going back to Cornell to do a doctorate in the history of architecture. These two moments aren't random bits of biography: the first led to the second. I'd majored in history of architecture for two years at Cornell and, as part of that, I took design studio as well. As was the goal of that curriculum, design studio taught me more than anything else during those two years. It seemed to me that there was something fundamentally wrong with the formalist system of education as it was taught at Cornell as well as at so many other schools, including Columbia.
I tried to find out what was behind this system and, since the first project we worked on was the Nine Square Grid, I eventually traced it to a book that was curiously missing from Cornell's Fine Arts Library and could only be found via interlibrary loan: the 1971 MoMA catalog The Education of an Architect: A Point of View, which chronicled the teaching of John Hejduk at Cooper Union. Alongside it was the book that every student seemed to hide in their desk like a pornographic magazine, the 1975 Five Architects, which presented the work of Peter Eisenman, Michael Graves, Charles Gwathmey, John Hejduk, and Richard Meier. These books were to be read furtively since they suggested that reading, not learning through drawing, might be possible and that the system of education then being employed, which presented itself as timeless, might itself have a history.
I found it difficult, almost impossible, to square these books—filled with formal games and an architecture that defied materiality—with the real political revolution of the 1960s. How could they co-exist in the same milieu? But clearly it was there: in his preface to Five Architects, Arthur Drexler, Director of Architecture and Design at MoMA, stated, "An alternative to political romance is to be an architect, for those who actually have the necessary talent for architecture."
In 1988, bound for architecture school, I witnessed the violence in the streets and a month later, decided that it was not the time for me to go to architecture school. I wasn’t just interested in understanding the way architecture repressed politics, I needed to understand it. Against all advice, I chose to try to understand the very educational system that I detested.
As I did so, I ran across two curious articles in, of all places, Spy Magazine. Spy, edited by Kurt Andersen, was a wild, satirical magazine that mercilessly targeted the celebrity class, most notably the "short-fingered vulgarian Donald Trump," who remembered that slight well enough that he tweeted about it in 2015 when Charlie Hebdo's offices were attacked. The first was a 1991 article by John Brodie, "Master Philip and the Boys." Brodie examined how Philip Johnson wielded power and influence in architecture and how he gathered "the Kids," a boys' club of formalist architects, around himself (curiously enough, this included three of the five members of the New York Five plus Robert Stern and Frank Gehry). The second was the late Michael Sorkin's 1988 article, "Where Was Philip?" (thanks, Trump, you fucking fuck, for dragging your feet on effective measures against the COVID-19 virus and thereby killing Michael Sorkin, much better a man than you or your awful children will ever be). Using primary sources, Sorkin uncovered how the very same Philip Johnson tried to create a fascist party in the United States, going so far as to accompany the Nazis on the blitzkrieg into Poland, singing the invading army's praises in (what we now call fake) news articles for extreme right-wing publications. Once again, one thing led to another: I realized these three phenomena—a formalist architectural education, an aged architect who gathered power around himself, and the largely suppressed Nazi past of that architect—were not distinct, but were all aspects of the same system.
I spent three years writing a dissertation exploring how contemporary architecture was fundamentally based on a process of spectacularization—of stripping politics from history for the purposes of an illiberal politics of form. I was writing in the aftermath of the scandals about Paul de Man's collaborationist writing and I was naïve enough to believe that the academy would take my work to heart and examine itself. Matters had already changed by the time I filed my dissertation, when Franz Schulze published his biography of Johnson. I hadn't known Schulze was writing the biography, or even doing research into Johnson's fascist period, until right before publication, so his work hardly impacted mine, apart from some last-minute revisions. Schulze's book was a bizarre and offensive combination of pinkwashing—Johnson loved the boys in their black boots and hot uniforms so it was all ok—and old-school outing (Johnson was gay! he was gay, imagine the scandal!). Schulze, who demonstrated no interest in critical thought, either didn't have the capacity to make the connection between the postwar whitewashing of Johnson's Nazi past and the concomitant removal of both Left and Right politics from modern architecture or, more likely, chose to ignore it. These were the same things: they both served power, privilege, and a Nietzschean drive to form at all costs. If Schulze didn't draw this connection, I would soon find out that the academy didn't want to hear it.
I published two articles from my dissertation in the Journal of Architectural Education. Editor Diane Ghirardo greatly supported this work, as did the peer reviews I received. The first article was "We Cannot Not Know History," in which I examined Johnson's Nazi past and the attempts to repress it. This immediately got me in hot water with the JAE's parent organization, the Association of Collegiate Schools of Architecture. Whether the directorate was interested in killing the article because—as they claimed—it opened the organization to libel lawsuits or whether they hoped to kill it because the article hit too close to home is up to you to decide. The second article was "The Education of the Innocent Eye." My exploration of the formalist system of architectural education was damning as well—after all, I concluded, "An initial exposure to architecture through the absolute system of the visual language of architecture clearly puts materialist questions second. Beam, column, wall, compression, shear, and rotation take precedence, and history, theory, gender, race, and class take a backseat." This time, quashing the article wasn't in the cards and I won Best Article of the Year from the JAE.
But I'd already found that not only was architecture unwilling to tackle the questions I'd raised, my dissertation topic was, on the contrary, an impediment to a career in the academy. Granted, it was the recession, but I spent two years unable to find employment until Margaret Crawford gave me a chance to teach history and theory of architecture at SCI_Arc. Even then, my work was received with suspicion by virtually all of the design faculty there, many of whom would have gladly curried favor with Johnson if given the opportunity. In the end, upon becoming Director, Eric Moss asked me in an accusatory tone what I was hoping to gain from writing this sort of thing. No great surprise: it was well known that he was a wannabe Kid. He made it clear I was unwelcome in his regime, and his leadership style (so clearly prefiguring Trump's presidency in its vulgarity and absolutism) was such that I didn't want to work for him in any event.
In the meantime, I’d hoped to publish my dissertation in book form. Theory was rapidly falling out of fashion and there weren’t many takers for my work on architecture education. What still shocked me was how reluctant publishers were to publish my material on Johnson. Encouraged by some colleagues (notably Mike Davis), I put together a proposal to publish my work on Johnson together with the texts he wrote back in the 1930s. Unfortunately, these publishers had no interest in helping architecture confront itself. One asked, like Moss, “what I hoped to gain from all this.” (I started posting these articles in 2017 but the silence sapped my spirit…I’ll try again another day and see if there is any greater interest this time).
I taught as a visiting faculty member at the University of Pennsylvania for a term under the late, brilliant, and gentle Detlef Mertins, but work there was only temporary. Clearly it wouldn't have been possible for me to stay in the history of architecture—my work was too dangerous for a powerful figure in that program, friends confided in me (I suspected having a non-Anglo-Saxon name was also a problem for this fellow, as it has been in so many places throughout my life)—and that was the last time I was a full-time member of a history of architecture faculty in the US. I reinvented myself: if my dissertation had been about social networks, I now looked to theorize the impact of digital networks on cities and forged a career in that field. I managed to land a position in the laboratory research wing of the architecture school at Columbia that lasted well over a decade until it was eliminated by a new Dean seeking to cut costs. Even so, it was clear at Columbia that my presence as a historian was deeply questionable, not to be publicly acknowledged, and I was never asked to be part of that faculty. Architecture still can't stomach critique; this is clear to me.
I'm out of teaching, likely for good. I knew that this work was too dangerous for architecture and I had a backup plan for funding that paid off at just about the same time that the labs were shut down at Columbia, so now I can work on other projects that interest me more. I'm much happier without the phony leftism of the university, of faculty who pretend to be Marxists but whose real goal is defending their own turf and the system.
I have a lot of (political) gardening to do, art to make, and a translation of one of my texts to look over, but along the way I intend to go back to some of these projects and post material from them on this site. Let's see if there's any interest this time. I'm not confident about it.