In his Theory of the Avant-Garde, Peter Bürger concluded that the avant-garde’s purpose is for art to sublate (assimilate) into life. In opposition to nineteenth-century aestheticism, which aimed to emphasize the autonomy of art from life, Bürger’s reading of the historical avant-garde—be it Dada, Surrealism, Productivism, Constructivism, or the Bauhaus—was that it aimed to break down the barrier between art and life, allowing the fullness of artistic expression to pass into all aspects of life.
For a large group of people in the developed world, this is now an everyday condition. Members of the creative class curate their lives around aesthetic choices; work and life are inseparable. Our lives are filled with intentional choices that express our individuality: we aspire to cook modernist cuisine, clean up with Marie Kondo, and obsess over the right boots and hat to go gardening in. STEM and maker culture are not opposed but inseparable: who doesn’t make their own jewelry or design their own body art these days, often using 3D modeling software and printers? Tens of thousands of people worldwide sit on Philippe-Starck-designed toilets every day. The workplace is a playground. Even after the recent plague, design festivals and biennales are a dime a dozen. Go glamping in Marfa, spend an evening at the local sip ‘n paint, bring your friends to the immersive van Gogh. This curated life is thoroughly documented, posted on Instagram for the world to see.
In fairness, Bürger believed that by the 1950s, when avant-garde techniques from Dada and Surrealism had been incorporated into advertising and television (think Ernie Kovacs or Ray and Charles Eames’s films here), the aestheticization of everyday life was complete and the avant-garde had been dealt a fatal blow. For Bürger, this is a false sublation, but I’m twice as old and jaded as I was when I first read the Theory of the Avant-Garde, and I don’t see how Bürger’s historical avant-garde could ever have been anything but a temporary reconciliation with an ultimately tragic end. The avant-garde was always a historically delimited moment. If you were to say that contemporary culture is thoroughly spectacularized, you would be right, but when a book on Constant Nieuwenhuys sells for $1,892 on Amazon, what is the spectacle anymore? Writing about Situationism has earned more than one professor tenure at a top university. Pinot Gallizio’s works, once sold by the yard, now sell for tens of thousands of Euros. The practices of Situationism have long since been absorbed by the spectacle. What is Dîner en Blanc® if not a Situationist practice? What is Situationism if not an excellent guerrilla marketing project?
That the Situationists or Fluxus chose to continue with the neo-avant-garde was merely an after-effect. No doubt there is much truth to that. The historical avant-garde is long dead, and with it the promise of art sublating into life.
Much of the art world has long abandoned any pretense of avant-gardism, embracing instead the idea of self-validation and value. Take NFTs, the realm of garish cartoon apes that have escaped from a Hot Topic store to scream “I am rich.” This is no different from the art at the very top of the market, touted as an investment vehicle that escapes the vicissitudes of corporate ups and downs, skipping price/earnings ratios and dividends for an unabashed belief in inflation and the greater-fool theory, but in reality acting primarily as a signifier of extreme wealth and good taste (and often a front for money laundering).
Other forms of art and architecture use politics as a form of branding, taking a page from Debord’s idea of the Spectacle. Take the hyper-branded architecture of Rem Koolhaas, Bjarke Ingels, Diller Scofidio + Renfro, and their ilk, often presented by academic “critics” as somehow serving to liberate people (which I suppose means from architectural convention) or as progressive (which is just baffling).
These two positions—the idea of self-validation and branding—come together in art that espouses a political position or identity politics. Now, the central point of the avant-garde had been to communicate political ideas and, especially after the Black Lives Matter and #MeToo movements, there has been a burst of interest in the art world in such art. Yet nobody has ever gone to an art gallery and come out a communist. Hedge fund manager Daniel Loeb collects art by Jean-Michel Basquiat, Richard Prince, Mike Kelley, and Cindy Sherman, all of whom have been political art darlings of Leftist art critics, and yet he is a major donor to Right-wing causes as well as a supporter of the neo-fascist menace that occupied the White House from 2017 to 2021. He is merely one egregious illustration; ultimately one’s political position hardly matters. What does it mean to have an El Lissitzky on one wall and a Frida Kahlo on another? It signifies wealth and aesthetic appreciation, not political allegiance. What does it mean to demonstrate solidarity with an identity group? Why is one lauded for affirming one’s sexuality loudly in art, even if Mapplethorpean transgression can no longer deliver the shock of the new? All this merely demonstrates one’s virtue.
Many members of the bourgeoisie, unable to escape the deeply ingrained notions of Protestantism but questioning its superstitions, have replaced the delusion of original sin with the notion of “privilege.” Surrounding oneself with art that trumpets the identity of its maker is a way of assuaging this guilt, even if—as the notorious Whitney “Collective Actions” show demonstrated—political art’s functional purpose isn’t to change the structural conditions it critiques but rather to underscore and cement those very conditions. Nor is this new: notwithstanding the newness of the phrase “virtue signaling,” virtue and art have long been linked, initially through religion, later through connoisseurship. And, of course, for many artists, the idea that art needs to be socially relevant assuages their own guilty consciences for producing useless things for the rich.
And yet, as Peter Sloterdijk explained in his Critique of Cynical Reason, it is the habit of such guilty consciences to turn cynical. The cynic (in the sense that Sloterdijk and I always use the word) is someone with an enlightened false consciousness: having read critical theory in university, the modern cynic knows that what he or she is doing is wrong but goes on doing it anyway. Sloterdijk writes that this makes cynics “borderline melancholics, who can keep their symptoms of depression under control and can remain more or less able to work.” For Sloterdijk, once an individual has become cynical, his or her hope has been lost, abandoned for expediency. Take, for example, the Marxist professor (a figure I met all too often in the university) who realizes that, with Revolution endlessly deferred, the best thing to do is to defend an academic position at all costs in order to continue preaching Adorno and Benjamin, even if that defense comes at the cost of cutting down rising faculty, avoiding any political activity outside the university, or refusing to see staff as human beings worthy of consideration. Fascism—both interwar and present-day American and European fascism—is the ultimate result, of course: a politics based on brutal expediency, in which democracy must ultimately give way to a “politics of pure violence.”
There is, however, a choice that avoids the cynical: embracing the most degraded of all ideas in art today, that of “the universal.” It may not be what many of you expect or find acceptable (although in private conversations, many of you have said that this is precisely what is necessary…). That possibility is the subject of Part II, which will come after an interregnum in which I get some work out there.
According to conventional chronological schemas, 2020—not 2019—was the last year of the 2010s.* This is convenient since, as I pointed out in last year’s premature review of the decade, the 2010s were “the decade of shit” and 2020 was a stinking pile of shit. The worst decade since World War II ended with the worst year since 1945.
My “year in review” posts are usually almost as late as my taxes, and when I finished last year’s post on February 12, we were all well aware that COVID was out there. No question, I missed the severity of the pandemic back then, but I was on the money about its psychic effects. For all of the horror of COVID, it isn’t horrible enough. COVID is banal. Instead of making us bleed out through all of our orifices as Ebola does, COVID is “a bad case of the flu” that leaves people dead or with debilitating cardiovascular and neurological ailments. But how different is my diagnosis, really, from what happened?
Now sure, this year we’ve already had firestorms the size of Austria ravaging Australia, a rain of rockets in Baghdad, Ukrainian jetliners getting shot out of the sky, a deadly pandemic in China caused by people eating creatures that they really shouldn’t, and the failure of the Senate to uphold the rule of law, but the banality of it all is crushing. While the Dark Mountain set drinks wine around a campfire, gets henna tattoos, and sings along to songs about the end of nature, for the rest of us it’s just an exhausting, daily slog through the unrelentingly alarming headlines.
COVID brought us yet more crushing banality. The Idiot Tyrant is gone, but we are trying his impeachment yet again. Everything changes, but nothing changes. We were all the Dark Mountain set this year, sitting around our campfires, singing songs about the End. It was another atemporal slog, one day bleeding into another, every day a Sunday in a country where everything is closed on Sundays and there is nothing to do, every day stranger and more disconnected than the last, something captured in comedienne Julie Nolke’s series of videos entitled “Explaining the Pandemic to My Past Self.”
Amidst the disconnection, the Jackpot—William Gibson’s term for a slow-motion apocalypse—cranked up a couple of notches. Just surviving the year was an accomplishment. The balance of life has been thoroughly disrupted, and that disruption isn’t going away any time soon. It’s not just COVID: we now feel certain that there will be more pandemics, more massive wildfires, and more superstorms in our future. The Earth isn’t dying (sorry, climate doomers), but there will be huge losses of species worldwide, human population decline is well underway in advanced societies (the US is finally on the bandwagon here), and massive deaths will take place across the planet until the population comes back to a sustainable level decades from now.
But the premise of the Jackpot is that it isn’t a final apocalypse: there will be another side. On his Twitter feed (@GreatDismal), Gibson focuses on the horrific and unjust nature of the Jackpot, but there will be winners, selected on the basis of wealth and sheer dumb luck. What might this say about the US election and the fact that 46% of American voters chose a cretin? Now, there is nothing particularly new about melding Tourette’s and dementia into a public speaking style; there are plenty of lunatics sitting on their porches screaming obscenities at their lawn ornaments. Everybody knows that Uncle Scam’s persona as a billionaire—or rather the King of Debt (his own term!)—is an act. The man with the golden toilet is not a successful businessman. He is weak, a loser who can’t stay married or stay out of bankruptcy court. Four years of misrule ended in abject failure: defeat in both the electoral and popular votes, bans from social media and, with his businesses failing, a forced exit from office in shame to face an unprecedented second impeachment, an array of civil litigation, and criminal investigations for fraud, tax evasion, incitement to riot, and rape. But this—not a misguided notion of him as a success—is the real point of his appeal. The short-fingered vulgarian is a life-long loser, a reverse Midas whose every touch turns gold to lead. In the face of the Gibsonian Jackpot, his appeal was precisely that of a stupid version of Homer Simpson: grabbing whatever scraps he could and, when that failed, LARPing as President, destabilizing society, and just blowing everything up.
LARPing was big in 2020, which saw the attempted kidnapping of Michigan Governor Gretchen Whitmer by wingnut idiots, various insane protests by COVID deniers, the attempted coup of the Capitol Insurrection, and the riots that developed after the Black Lives Matter protests. BLM was the standout among these, not only because it was a good, just cause but also because the majority of the protests themselves were peaceful—such as the one in our town of Montclair, New Jersey. None of that was LARPing, but the riots that accompanied it were. For the most part, these were less people with genuine grievances and more Proud Boys, Boogaloos, anarchists, and grifters who came in to loot and burn whatever they could. Although there were kooky moments on the Left like the Capitol Hill Autonomous Zone, Antifa, for however much it exists, didn’t do much, certainly proving to be far less trouble than white supremacist-infiltrated police forces in paramilitary gear. Still, the widely vaunted second Civil War never came about, and the arteriosclerotic LARPers on the Right limped off the field in defeat after they got a spanking at the January putsch.
A number of observers at both the Capitol Insurrection and CHAZ—including some of the idiots who took part in them—noted that these events felt much like a game, specifically an Alternate Reality Game (ARG). In a typical ARG, players look for clues both online—think of the QAnon drops, the Trumpentweets, or the disinformation dished out by the skells at 4chan, 8chan, and so on—and out in the world. Jon Lebkowsky, in a post at the Well’s State of the World, and Clive Thompson, over at Wired, compare QAnon to an ARG. Indeed, gaming is taking the place of religion (whichever grifter figures out how to meld this with Jesus and his pet dinosaurs will get very rich indeed), with the false promise that playing the game and winning will deliver one to the other side of the Jackpot. Somewhere, I read that when asked what he would do differently if he had made Blade Runner a decade later, Ridley Scott replied that he would be able to skip the elaborate sets and just point the camera down the streets of 1990s Los Angeles. Today, the same could be said of The Hunger Games.
But not everything was LARPing. If Cheeto Jesus is an icon for LARPing losers, Biden was elected on the premise of staving off the Jackpot by returning adults to the White House. This is not a bad thing; we might as well try. Still, from the perspective of Jackpot culture, the most interesting political development of the year was the candidacy of Andrew Yang, whose cheery advocacy of Universal Basic Income (aka the Freedom Dividend) masked the dark, Jackpot-like nature of his predictions. Let’s quote Yang’s campaign site on this: “In the next 12 years, 1 out of 3 American workers are at risk of losing their jobs to new technologies—and unlike with previous waves of automation, this time new jobs will not appear quickly enough in large enough numbers to make up for it.” No matter how friendly Yang’s delivery, there is a grim realism to his politics, an acceptance that things will never be better for a massive sector of the population. Certainly some individuals will find ways to use their $1,000-a-month Freedom Dividend as a subsidy to do something new and amazing, but 95% will not. Rather, they will form a new and permanent underclass as they fade into extinction. Again, the point of Yang’s candidacy wasn’t the cheerleading for math and STEM; it was the frank acknowledgement that the Jackpot is already here.
On the other hand, toward the end of the year, Tyler Cowen suggested that we might be nearing the end of the Great Stagnation (he is, of course, the author of an influential pamphlet on the topic), and you can find a good summary of the thinking, pro and con, by Cowen’s student Eli Dourado here. In this view, advances such as the mRNA vaccines, the spread of electric, somewhat self-driving vehicles, the pandemic-induced rise of remote work, and the huge drop in the cost of spaceflight are changing things radically and could lead to a real rise in Total Factor Productivity from the low level at which it has been stuck since 2005. Is this a sign of the end of the Jackpot? Unlikely. That won’t come until a series of more massive technological breaks, probably (but not necessarily) involving breakthroughs in health (the end of cancer, heart disease, and dementia), the reversal of climate change, working nanotechnology, and artificial general intelligence. But still, there are signs that early inflection points are at hand.
Personally, we experienced one of these inflection points when we replaced our aging (and aged) BMWs with Teslas. I wound up getting a used Tesla Model S last January and then immediately turned around and ordered a brand-new Model Y, which we received in June. No more trips to the gas station, and while “Full Self-Driving” is both expensive and nowhere near fully self-driving, it is a big change. Longer road trips—which under the pandemic have been to nurseries on either side of the Pennsylvania border to buy native plants—have become much easier, even if I still have to keep my hands on the wheel and fiddle with it constantly to prevent self-driving from disengaging. But harping too much on the incomplete nature of self-driving is poor sport: in the last year, Tesla added stop-light recognition to self-driving, and a new update in beta promises to make city streets fully navigable. Less than a decade ago, self-driving was only a theoretical project. Now I use it for 90% of my highway driving. That’s a sizable revolution right there. The all-electric and connected nature of these cars also makes getting takeout and sitting in climate-controlled comfort on the road a delight. Electric vehicles were a big success this year, and in our neighborhood, which is a bellwether for the adoption of future technology (when I saw iPhones replace BlackBerrys on the bus and train into the city, I bought a bit of Apple stock and made a small fortune), Teslas have replaced BMWs as the most common vehicle in driveways.
Back to the pandemic, which accelerated a sizable shift in habitation patterns. Throughout the summer, there was a lot of nonsense from neoliberal journalists and urban boosters about how cities were going to come back booming, but with more bike lanes, wider sidewalks, less traffic, and more awesome tactical urbanist projects to appeal to millennials. Lately, however, those voices have fallen silent, and with good reason. In this suburb the commuter train platforms are still bare in the mornings, and the bus into the city, once packed to standing-room-only levels every evening, hasn’t run in five months. A friend who works in commercial real estate says that occupancy in New York City offices is at 15% of pre-pandemic levels. Business air travel is still off a cliff. Remote work isn’t ideal for everyone and every job, but neither was going into the office. For sure, the dystopian open offices, co-working spaces, and offices as “fun” zones are finished. People are renovating their houses, or upsizing, to better live in a post-pandemic world of remote work. Another friend, who works for a large ad agency, told me that they did not renew their lease for office space and do not plan ever to go back to in-person work, at least for the vast majority of the staff. When employees gain over two hours a day from not commuting and corporations save vast fortunes on rent, remote work seems a lot more appealing. Retail sales here and in the surrounding towns have gone through the roof, just as they have in many suburbs.
But it isn’t just suburbia that has prospered at the expense of the city; exurbia has returned too. Way back in 1955, Auguste Comte Spectorsky identified a growing American cultural class that he dubbed “the exurbanites,” made up of “symbol manipulators” such as advertisers, musicians, artists, and other members of what we today call the creative class. Spectorsky observed that many of these individuals eventually tended to drift back to the city. This time may be different. After two decades in the city, the creative class is turning to places outside it with attractive older houses and midcentury modern properties, walkable neighborhoods (virtually all of Montclair, for example, has sidewalks), good schools (which generally mean high property taxes but are an indicator of a smarter, engaged populace), amenities like parks and places to hike, decent bandwidth, as well as independent restaurants, shops, and cultural attractions. There will always be variations in taste: some people really do want to eat at Cheesecake Factory and live in a Toll Brothers McMansion, but these will appeal to relatively few of the people fleeing cities at this point. Thus the Hudson Valley—full of older, more interesting architecture, great natural resources, and quirky towns—is booming. I predict some reversion toward the mean after the pandemic ends, when some of the people who fled to the country realize they aren’t suited to a place without SoulCycle, but this will be only a partial and temporary reversion.
I predict that even after the pandemic ends, there will be a greater interest in self-sufficiency among young people who move to suburbia and exurbia. Manicured lawns will be less important than vegetable gardens. Homesteading, permaculture, and a drive back to the land not seen since the 1960s are under way. It would be a very good thing if the next generation were more in touch with their land and less prone to hiring “landscapers” who treat properties as sites for industrial interventions such as chemical fertilizer for lawns and a phalanx of gas-powered lawn mowers and leaf blowers deployed to remove any stray biological matter.
As far as cities go, the pandemic is triggering a necessary contraction. The massive annihilation of real estate value it has caused should go a long way toward undoing the foolish notion that urban real estate is always a great investment. It’s not; just ask anyone who bought a house in Detroit in 1965. Real estate in first- and second-tier global cities has become wildly expensive, disconnected from the underlying fundamentals. When individuals are paying rents that absorb over 30% of their salaries to investor-owners who are not covering their mortgages with those rents, something is very wrong. This broken system has been able to function due to the perceived hedonic value of restaurants, bars, and cultural events, but these things too have been failing in recent years. Long before the pandemic, the cost of rent decimated independently owned restaurants and retailers, with the latter also hurt by online shopping. The golden age of dining out (if it really was a golden age… I would say that better food could be had in other, less copycat eras) was already declared over in 2019. “High-rent blight,” in which entire streets’ worth of storefronts sit empty due to ludicrous rents, has been common for some time now. Tourists made up more and more of the street crowds while loss-leader flagship stores for chains like Nike and Victoria’s Secret replaced local businesses. With the hedonic argument for staying in the city rapidly disappearing, it was only a matter of time before individuals began departing and, in New York, population had begun to drop by 2018 (see more on all of this in Kevin Baker’s piece for the Atlantic, “Affluence Killed New York, Not the Pandemic”). Perversely, this is a good thing, as it will likely lead to a bust in commercial real estate prices and a decline in unoccupied or Airbnb’d apartments, thus making global cities like New York places with potential again. Moreover, many second-tier cities such as St. Louis, Kansas City, and Cleveland are experiencing new growth as individuals able to work remotely look for places that are less expensive—and thus have more potential—than New York or San Francisco.
These shifts are huge and for the better. As I tried to tell my colleagues at the university, there is no housing crisis, at least not in the US and Europe; there is only the appearance of one because of the uneven distribution of housing: a glut in some areas, a shortfall in others. The pandemic has likely undone this a bit. Of course, places that are too politically Red, too full of chains, too full of copycat McMansions are unlikely to come back anytime soon, if ever. The Jackpot continues.
Still, one reason I foresee a perversely rosy future for the urban (and suburban and exurban) environment is the Biden administration’s interest in infrastructure. Back in 2008, I shocked design critics when I stated that there would be no progress in infrastructure for the foreseeable future. “But, Obama,” they complained. “But, Obama,” I clapped back, “just appointed Larry Summers as his chief economic advisor, and Summers will bail out the banks, not fund infrastructure.” I expect the opposite from Biden, who has adopted a “nothing left to lose” position as a purportedly one-term President, is a devotee of train travel, and is eager to make great progress on climate change. Appointing Pete Buttigieg, one of his two smartest opponents in the primary (the other being Andrew Yang, of course), as Secretary of Transportation is a key move. This will be Buttigieg’s opportunity to prove himself on the national stage, and he will fight hard to do so, just as Biden expects. Expect more electrification across the board and, I suspect, more advances with self-driving vehicles. Although certain measures will help cities (in the New York City area alone, the Gateway Tunnel between New Jersey and New York, now delayed over a decade thanks to Chris Christie and Donald Trump’s vindictiveness against commuter communities that would not vote for them, and the reconstruction of the Port Authority Bus Terminal), I predict, again, more emphasis on decentralization and activity outside the city.
All this may have salutary cultural implications. The global city is played out. Little of interest happens in New York, San Francisco, London, Paris, or Barcelona. These cities are too expensive for the sort of experimentation that made them great cultural centers, and the diffusive nature of the Internet, capitalism, and overtourism have made them all the same. Residents of cities that have been victims of overtourism have seen this as an opportunity to reset, while the physical isolation of cities is going to increase reliance on local institutions. With some luck, all this leads to a new underground, with greater difference creating greater diversity and potential. Of fashion, Bruce Sterling writes, “Fashion will re-appear, and some new style will dominate the 2020s, but the longer it takes to emerge from its morgue-like shadow, the more radically different it will look.” The same could be true of all culture. Globalization was an incredibly powerful force, but it is now played out. I don’t agree with the protectionist instincts of the Trumpenproles, but today culture’s hope is to thrive on the basis of the difference between places and cultures, not on greater sameness. Architecture has been very slow to react to all of this, in part because many intelligent young people have drifted into other fields, like startups, but I am optimistic that we might soon get past the ubiquitous white-painted brick walls and wood common tables (the architecture of the least effort possible, to match fashion and food driven by the least effort possible), the tired old Bilbao effect, and quirky development pseudo-modernism.
So much optimism on my part! Even I am shocked that I am so positive. But why not? The end to this exhausted first phase of network culture is overdue. Time for a new decade, at last.
*The reason for this is that there is no year zero: 31 December 1 BC is followed immediately by 1 January AD 1, so the first decade ran from AD 1 through AD 10, and every decade since has ended in, not begun with, a year ending in zero.
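For anyone who wants to check the footnote’s arithmetic, here is a minimal sketch in Python (the `decade_bounds` function is my own, purely for illustration):

```python
def decade_bounds(year):
    """Return the (start, end) years of the decade containing `year`,
    counting decades from AD 1, since there is no year zero."""
    start = ((year - 1) // 10) * 10 + 1  # the first decade is AD 1-10
    return start, start + 9

print(decade_bounds(2020))  # (2011, 2020): 2020 closes the 2010s
print(decade_bounds(2021))  # (2021, 2030): 2021 opens the new decade
```

Counting from AD 1 rather than a year 0 is what shifts each decade to run from a year ending in 1 through a year ending in 0.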
Network Culture is predicated on an affluent society, but wealth is increasingly relative. As the New York Times reports, earnings that would once have been considered upper class now seem second-rate, especially if one lives in an area like Silicon Valley. The Times also has a story on Robert H. Frank’s Falling Behind: How Rising Inequality Harms the Middle Class. Extreme consumption by the super-rich drives us to try to buy larger houses, better toys, and bigger, more powerful cars even as average family income has stayed flat since the 1970s.
A "correction," as the Fed likes to call it, or a crash of some sort seems in the cards, both economically and ecologically. But what other consequences does this have for Network Culture? Is this the last burst of material desire prior to the full dominance of the immaterial? Or is the latter just a superstructure, unavoidably dependent on the former (this might be the argument of the Netlab’s research into logistics, for example)?