A Decade in Retrospect

Never mind that the decade really ends in a little over a year; it’s time to take stock of it. Today’s post looks back at the decade just past, while tomorrow’s will look at the decade to come.

As I observed before, this decade is marked by atemporality. The greatest symptom of this is our inability to name the decade: although commentators have tried to dub it the naughties, the aughts, and the 00s (is that pronounced the ooze?), the decade remains, as Paul Krugman suggests, a Big Zero, and we are unable to periodize it. This is not just a matter of linguistic discomfort; it’s a reflection of the atemporality of network culture. Jean Baudrillard has been proved right. History, it seems, came to an end with the millennium, which was a countdown not only to the end of a millennium but also to the end of meaning itself. Perhaps, as the Daily Miltonian suggested, we didn’t have a name for the decade because it was so bad.

Still, I suspect that we historians are to blame. After Karl Popper and Jean-François Lyotard’s condemnation of master narratives, periodizing—or even making broad generalizations about culture—has become deeply suspect for us. Instead, we stick with microhistories on obscure topics while continuing our debates about past periods, damning ourselves into irrelevance. But as I argue in the book that I am currently writing, this has led critical history to a sort of theoretical impasse, reducing it to antiquarianism and removing it from a vital role in understanding contemporary culture. Or rather, history flatlined (as Lewis Lapham predicted), leaving even postmodern pastiche behind for a continuous field in which anything could co-exist with anything else.

Instead of seeing theory consolidate itself, we saw the rise of network theory (a loose amalgam of ideas ranging from the work of mathematicians like Duncan Watts to that of journalists like Adam Gopnik) and post-criticism. At times, I felt like I was a lone (or nearly lone) voice against the madding crowd in all this, but times are changing rapidly. Architects and others are finally realizing that the post-critical delirium was an empty delusion. The decade’s economic boom, however, had something of the effect of a war on thought. The trend in the humanities is no longer to produce critical theory; it’s to get a grant to produce marketable educational software. More than ever, universities are capitalized. The wars on culture are long gone as the Right turned away from that straw man and the university began serving the culture of network-induced cool that Alan Liu has written about. The alienated self gave way to what Brian Holmes called the flexible personality. If blogs sometimes questioned this, Geert Lovink pointed out that the questioning was more nihilism than anything else.

But back to the turn of the millennium. It was marked not so much by possibility as by delirium. The dot-com boom, the success of the partnership between Thomas Krens and Frank Gehry at the Guggenheim Bilbao, and the emergence of the creative cities movement established the themes for this decade. On March 10, 2000, the tech-heavy NASDAQ index peaked at 5,048, twice its value the year before. In the six days following March 16, the index fell by nine percent, and it did not stop falling until it reached 1,114 in October 2002. If the delirium was revealed, the Bush administration and the Federal Reserve found a tactic to forestall the much-needed correction. Under the pretext of striving to avoid full-scale collapse after 9/11, they set out to create artificially low interest rates, deliberately inflating a new bubble. Whether they fully understood the consequences of their actions or found themselves unable to stop them, the results were predictable: the second new economy in a decade turned out to be the second bubble in a decade. If, for the most part, tech was calmer, architecture had become infected, virtualized and sucked into the network not to build the corporate data arcologies predicted by William Gibson but as the justification for a highly complex set of financial instruments that seemed to be crafted so as to be impossible to understand even by those crafting them. The Dow ended the decade lower than it started, even as the national debt doubled. I highly recommend Kevin Phillips’s book Bad Money: Reckless Finance, Failed Politics, and the Global Crisis of American Capitalism to anyone interested in trying to understand this situation. It’s invaluable.

This situation is unlikely to change soon. The crisis was created by the over-accumulation of capital and a long-term slowdown in the economies of developed nations. Here, Robert Brenner’s The Economics of Global Turbulence can help my readers map the situation. To say that I’m pessimistic about the next decade is putting it lightly. The powers that be had a critical opportunity to rethink the economy, the environment, and architecture. We have not only failed on all these counts, we have failed egregiously.

It was hardly plausible that the Bush administration would set out to right any of these wrongs, but after the bad years of the Clinton administration, when welfare was dismantled and the Democrats veered to the Right, it seemed unlikely that a Republican presidency could be that much worse. If the Bush administration accomplished anything, it accomplished that, becoming the worst presidency in history. In his review of the decade, Wendell Berry writes, "This was a decade during which a man with the equivalent of a sixth grade education appeared to run the Western World." If 9/11 was horrific, the administration’s response—most notably the disastrous invasions of Afghanistan and Iraq, alliances with shifty regimes such as Pakistan, and the turn to torture and extraordinary rendition—ensured that the US would be an enemy for many for years to come. By 2004, it was embarrassing for many of us to be American. While I actively thought of leaving, my concerns about the Irish real estate market—later revealed as well-founded—kept me from doing so. Sadly, the first year of the Obama administration proved the Democrats were hopeless: he kept in place some of the worst policies and personnel of the Bush administration, received a Nobel Peace Prize for little more than inspiring hope, and surrounded himself with the very same sorts of financiers that caused the economic collapse in the first place. No Republican could have done as much damage to the Democratic party as its own bumbling leader and deluded strategists did. A historic opportunity has been lost to history.

Time ended by calling it "the worst decade ever."

For its part, architecture blew it handily. Our field has been in crisis since modernism. More than ever before, architects abandoned ideology for the lottery world of starchitecture. The blame for this has to be laid on the collusive system of architects, critics, developers, museum directors, and academics, many of whom were happy as long as they could sit at a table with Frank Gehry or Miuccia Prada. This system failed, and failed spectacularly. Little of value was produced in architecture, writing, or history.

Architecture theory also fell victim to post-criticism, its advocates too busy being cool and smooth to offer anything of substance in return. Perhaps the most influential texts for me in this decade were three from the last one: Deleuze’s Postscript on the Societies of Control, Koolhaas’s Junkspace, and Hardt and Negri’s Empire. If I once hoped that some kind of critical history would return, instead I participated in the rise of blog culture. If some of these blogs simply endorsed the world of starchitecture, by the end of the decade young, intelligent voices such as Owen Hatherley, David Gissen, Sam Jacob, Charles Holland, Mimi Zeiger, and Enrique Ramirez, to name only a few, defined a new terrain. My own blog, founded at the start of the decade, has a wide readership, allowing me to engage in the role of public intellectual that I’ve always felt it crucial for academics to pursue.

Indeed, it’s reasonable to say that my blog led me into a new career. Already, a decade ago, I saw the handwriting on the wall for traditional forms of history-theory. Those jobs were and are disappearing, their course hours usurped by the demands of new software, as Stanley Tigerman predicted back in 1992. Instead, as I set out to understand the impact of telecommunications on urbanism, I found that thinkers in architecture were not so much marginal to the discussion as central, if absent. Spending a year at the University of Southern California’s Annenberg Center for Communication led me deeper into technology; not only was Networked Publics the result, but I was also able to lay the groundwork for the sort of research that I am doing at Columbia with my Network Architecture Lab.

The changes in technology were huge. The relatively slow pace of technological development from the 1950s to the 1980s was left long behind. If television acquired color in the 1960s and cable and the ability to play videotapes in the late 1980s, it was still fundamentally the same thing: a big box with a CRT mounted in it. That’s gone forever now, with analog television a mere memory. Computers ceased being big objects connected via slow telephone links (just sixteen years ago, in 1993, 28.8 kbps modems were the standard) and became light and portable, capable of wireless communications fast enough to make downloading high-definition video an everyday occurrence for many. Film photography all but went extinct during the decade as digital imaging technology changed the way we imaged the world. Images proliferated. There are 4 billion digital images on Flickr alone. The culture industry, which had triumphed so thoroughly in the postmodern era, experienced the tribulations that Detroit felt decades before, as music, film, and periodicals were all thrown into crisis by the new culture of free media trade. Through the iPod, the first consumer electronics device released after 9/11, it became possible for us to carry more music than we would be able to listen to in a year. Media proliferated wildly and illicitly.

For the first time, most people in the world had some form of telecommunication available to them. The cell phone went from a tool of the rich in 1990 to a tool of the middle class in 2000. By 2010, more than 50% of the world’s population owned a cell phone, arguably a more important statistic than the fact that during this decade, for the first time, more people lived in cities than in the country. The cell phone was the first global technological tool. Its impact is only beginning to be felt. In the developed world, not only did most people own cell phones, but the phones themselves became miniature computers, delivering locative media applications such as turn-by-turn navigation and geotagged photos (taken with their built-in cameras), together with e-mail, web browsing, and so on. Non-places became a thing of the past as it was impossible to conceive of being isolated anymore. Architects largely didn’t have much of a response to this, and parametric design ruled the studios, a game of process that, I suppose, took minds off of what was really happening.

Connections proliferated as well, with social media making it possible for many of us to number our "friends" in the hundreds. Alienation was left behind, at least in its classical terms, as was subjectivity. Hardly individuals anymore, we are today, as Deleuze suggested, dividuals. Consumer culture left behind the old world of mass media for networked publics (and with it, politics left behind the mass, the people, and any lingering notion of the public), and the long tail reshaped consumer culture into a world of niches populated by dividuals. If there was some talk about the idea of the multitude or the commons among followers of Hardt and Negri (but also more broadly in terms of the bottom-up and the open source movement), there was also a great danger in misunderstanding the role that networks play in consolidating power at the top, a role that those of us in architecture saw first-hand in starchitecture’s effects on the discipline. If open source software and competition from the likes of Apple hobbled Microsoft, the rise of Google, iTunes, and Amazon marked a new era of giants, an era that Nicholas Carr covered in The Big Switch (required reading).

The proliferation of our ability to observe everything and note it also made this an era in which the utterly unimportant was relentlessly noted (I said "relentlessly" constantly during this decade, simply because it was a decade of relentlessness). Nothing, it seemed, was the most important thing of all.

In Discipline and Punish, Foucault wrote, "visibility is a trap." In the old regime of discipline, panopticism made it possible to catch and hold the subject. Visibility was a trap in this decade too, as architects and designers focused on appearances even as the real story was the financialization of the field that undid it so thoroughly in 2008 (this was always the lesson of Bilbao… it was finance, not form, that mattered). Realizing this at the start of the decade, Robert Sumrell and I set out to create a consulting firm along the lines of AMO. Within a month or two, we realized that this was a ludicrous idea, and AUDC became the animal that it is today, an inheritor of the conceptual traditions of Archizoom, Robert Smithson, and the Center for Land Use Interpretation. Eight years later, we published Blue Monday, a critique of network culture. I don’t see any reason why it won’t be as valuable—if not more so—in a decade as it is now.

I’ve only skimmed the surface of this decade in what is already one of the lengthiest blog posts ever, but over the course of the next year or two I hope to come to an understanding of the era we were just in (and continue to be part of) through the network culture book. Stay tuned.
