
Fear of Flying

Iceland's Eyjafjallajökull volcano hasn't given up disrupting North Atlantic air travel this summer, but what if it's the harbinger of something bigger?

The global city is predicated on face-to-face communication being essential to major business deals. But the global city model, originally outlined by my colleague Saskia Sassen, is almost twenty years old. Try booting up your PowerBook 100 to read this blog post. In this post I'd like to speculate on the impact of the volcano, technology, and global warming on the global city.

First, let's talk global warming and green hype. During the last decade, friendly but misguided green advocates have promoted pedestrian-oriented cities as environmentally sound alternatives to the suburbs. But looking at America (and many countries in Europe aren't all that different), most cities have seen sustained and uninterrupted declines over the last half century. The starring exceptions are the global cities of various scales: New York, Chicago, Boston, LA, San Francisco, and so on. For the most part, these cities have seen a remarkable renaissance as centers of business and creative activity. The urbanites who live in them inhabit the global city, thinking nothing of jetting from London to Shanghai and alighting in San Francisco. Often they inhabit it literally: owning pieds-à-terre on multiple continents is increasingly as common among the super-wealthy as owning an estate. At home, the "creative class" practices localism religiously, probably out enjoying home-smoked bacon cupcakes and carbon-neutral triple-pulled ristrettos right now.

But the idea that this kind of life—which is as predicated on consumption as existence in deepest suburbia—is environmentally sound is laughable. Apart from the manic rate of conspicuous consumption in the global city, flying one mile on an airplane produces almost as much CO2 as driving that same mile alone in an automobile (other side effects, including pollution released high up into the very thin atmosphere, may be much worse). Moreover, if an average driver in the United States drives some 12,000 miles a year, that's only half of what you need to get into a frequent flyer club.
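To make the back-of-envelope arithmetic concrete, here's a rough sketch. The per-mile emission factor and the 25,000-mile frequent-flyer threshold are assumptions of mine for illustration, following the flying-roughly-equals-driving equivalence above rather than any particular dataset.

```python
# Back-of-envelope comparison of annual CO2 from driving vs. flying.
# Assumed figures, for illustration only: roughly 0.4 kg of CO2 per mile
# for a solo driver, and about the same per passenger-mile for flying
# (per the rough equivalence suggested above); 25,000 miles as a typical
# threshold for entry-level frequent-flyer status.
KG_CO2_PER_MILE = 0.4          # assumed, car and plane alike
ANNUAL_DRIVING_MILES = 12000   # average US driver, as cited above
FREQUENT_FLYER_MILES = 25000   # assumed elite-status threshold

driving_tons = ANNUAL_DRIVING_MILES * KG_CO2_PER_MILE / 1000
flying_tons = FREQUENT_FLYER_MILES * KG_CO2_PER_MILE / 1000

print(f"Driving {ANNUAL_DRIVING_MILES:,} miles: ~{driving_tons:.1f} metric tons of CO2")
print(f"Flying {FREQUENT_FLYER_MILES:,} miles: ~{flying_tons:.1f} metric tons of CO2")
```

On those assumptions, the frequent flyer's air miles alone produce roughly twice the CO2 of the average driver's year behind the wheel, before any of the jet-setter's other consumption is counted.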

I think by now you get the picture: the high-flyer of the global city is much worse for the environment than the suburbanite. So much for sustainable living. 

Now back to the volcano. The impact it's had on transatlantic travel has been massive, as planes continue to be grounded in one European country or another multiple times a week. Pollution-wise, the amount of CO2 it released is significantly less than the amount of CO2 that would have been produced by the Airbuses and Boeings that happened not to fly on those days (obviously, the volcano also released other pollutants, many of which are quite toxic to life). Business travel had already dropped as a result of the recession. The volcano is a wake-up call. If my business relied on frequent international travel for face-to-face meetings, I'd begin asking myself how sustainable that is from an economic standpoint and how vulnerable my business is to such disruptions.

There's more to the story. As I stated earlier, we're far from the days of the PowerBook 100, which couldn't even browse the Web. Seventy percent of stock market trades now take place between computers at millisecond speeds. I have a hunch that the face-to-face financial deals that used to drive the global financial markets are becoming less important economically.

Let's put this all together then. A perfect storm is emerging. Far from the suburbs collapsing, as Richard Florida's great reset would have it, it is likely the global city that will collapse, replaced by ubiquitous high-speed telecommunications and undone by changing climatological conditions, not to mention peak oil.

Make no mistake, I'm not offering up a new utopia of any sort here. What I'm predicting is an end to network culture as we know it, and it won't be pretty. The coming collapse of the global city will be slow and brutal, accompanied by the stationary state that Gopal Balakrishnan described last year.

I don't see many easy solutions out there. Ironically, the best bet is probably the very scare-word the American right loves to deploy: socialism. It's unlikely to take hold in the US, at least not for a generation or two, but some countries will probably get the drift and head in that direction. What gets us out of this morass and what form of global spatial organization replaces the global city is unclear. Still, the late, great global city was far from equitable or sustainable. We can hardly lament its passing.

For the Record

Nothing irks me more than the idiots* who say that nobody saw the crash coming. I blogged about it years before it happened. It was plain as day. The real estate market was a bubble. Nothing fundamental had changed.

So, for the record, today's bump in the stock market suggests just how fragile the markets are. I've brought this up many times in the Networked Publics panels, but it's worth mentioning again: high-velocity trading is a major threat to the markets, and the markets are far from stable.

In literally the blink of an eye, the Dow dropped over 995 points. It bounced back, but was still down nearly 350 at the end of the day.

This isn't the kind of glitch we should ignore. It's a warning underscoring how unsound our financial markets are. Anyone interested in the survival of the current economic system should hope that the Obama administration doesn't ignore it.

*Of course some of the people saying that nobody saw the crash coming aren't idiots; they're liars.

Goodbye to the Record Store

I spent half of my childhood in the thick of things in Chicago and the other half in rural-exurban Western Massachusetts. It always surprises me when someone says "I can't imagine you in the countryside" (I often fantasize publicly about living in Vermont or somewhere similarly rural). What, Points of Interest in the Owens River Valley wasn't enough for you? 

Since my exurban life came during my all-important teenage years, I found it crucial to visit the city, where I'd scour the record stores, or to tune into WRPI, a great industrially-oriented radio station, something I could only do whenever the horrific local Christian station was off the air. When I went to college at Cornell in Ithaca, New York, I was even further from civilization, without even a decent radio station (the college station was obsessed with Phish, an infinitely worse fate than even classic rock), and with only so-so record stores. I invested in a shortwave radio to listen to the John Peel show (and, when I could get it, the brilliant, ill-fated Radio Sierra Leone) and took painfully long road trips to the city and the same record stores to collect more music.

All this is gone now. I haven't been to a record store in years. I'm a bit of an audiophile, so I still keep the best music on CD, but no record store is as efficient as the Net, so even that fix takes place online. In any event, the record stores have closed down, the staff off to do God knows what. The scene is gone.

Why do I blog this? Simply enough: the old role of cities as places that you go to in order to experience hard-to-find culture is over. The Nick Hornby novel/film High Fidelity is completely foreign to network culture. Ours is the world of the Long Tail. Everything is available. The city is dead.  

A Chapter on Atemporality

I've put a revised version of the introduction to my book on network culture together with the first chapter—on atemporality—on my site. I hope you'll be as excited to read this material as I am to post it.

I know that I owe most of my readers a few words of explanation about why it took over a year to post a chapter that I had initially thought I'd have up within a couple of months.

First, I had the honor of writing a chapter in Networked: A (Networked) Book on (Networked) Art. As part of this project, I agreed that I wouldn't take the material for the chapter and immediately publish it on my own site. That material, like a lot of the research I did last year, requires substantial reworking to fit the book (little of it is in the first chapter…you'll see it later, in the chapter on poetics).

Second, I've thoroughly rethought the book during the intervening year not once but repeatedly. This is hardly a crisis, but rather the way that I—and many historians—write. Revise again and again as you nibble at unformed parts until everything comes together.

Some of you have asked how the revision process works, so I've left the record on the site; just go to the revisions tab for any section and compare the current version with earlier ones. Of all the revisions, the most significant is a new model of historical succession that I find simply works for network culture. Whereas last year I had some uncertainty about just how this book would be a history, the first chapter—which of course is on history—now makes my strategy of relying on Michel Foucault and Jeffrey Nealon's model of intensification emphatically clear.

Speaking of revisions, make no mistake, there are plenty of rough patches in these chapters. This is, after all, a draft. Don't read it if you want a finished product. But also don't think you should hold back on your commentary. Whether at Networked or at other ventures including this one, networked books have largely failed at generating comments. Don't let that stop you. If you see a problem in the text, call me out on it wherever you feel appropriate. The more that I can draw on the massive collective intelligence of my readership, the better this project will be.

While I'm on the topic of collective intelligence… This first chapter owes much to a dialogue that Bruce Sterling and I have maintained between our blogs (take, for example, Bruce’s discussion of atemporality in his keynote address at Transmediale this year) and on Twitter with many of you. All of the kind attention that this dialogue brought during the first few months of the year makes me think that my attempt to write a history of atemporality is both timely and untimely (in Nietzsche’s sense).

Finally, a word about the book title. It's very much in flux now, but I'm thinking it might be "Life After Networks: A Critical History of Network Culture."   

On the iPad's Fatal Flaw

I've had my iPad for a short while and am enjoying it immensely. Anecdotally speaking, I've noticed that people who don't immediately understand why they would want one wind up taking them back to the store or, if they didn't purchase one, sometimes even get hostile (sometimes even when they should know better because, say, they teach in the digital media field).

There's no question anymore that this is a successful implementation of a computing typology that is fundamentally different from either a laptop or a desktop. A tablet computer that is ready to go at a moment's notice is great for looking up recipes in the kitchen, for reading a newspaper or a book on the subway, and perfect for taking notes in lectures. It's much less intrusive than a laptop, which can't be held in one hand when standing and creates a barrier between the individual and others in a seminar or classroom. The multitouch interface works much better on the iPad than it does on the iPhone. Of the two, the latter seems like the unit I can more easily live without.

I take immense pleasure in being able to haul around hundreds of books in a device that weighs less than a copy of Fredric Jameson's Postmodernism and occupies less space. Highlighting isn't available yet, but it will be soon and, with it, full-text search. At that point, the transformation of academic books into immaterial objects will be just a matter of time. I used to care a great deal about accumulating a library at home, but if I can have one with me in my bag, then which is more useful?

Still, don't get me wrong. If a comparable product emerges from another vendor, I will defect immediately. I'm no great fan of the walled garden of applications that Apple has created, nor am I a fan of their "Father Knows Best" attitude toward the user. But everything so far is still vaporware or much less capable, so I'm stuck with the iPad for now.

As promised in the title of this piece, there IS a fatal flaw to the iPad, only it's fatal not to Apple but to the media. There has been a lot of noise about how the iPad would give the media one more chance to survive. I was dubious that the iPad would play Jesus to the media to begin with, but now that Apple has banned applications developed by Adobe's Flash Packager for iPhone, it's game over. 

Where a periodical previously would have been able to develop an issue in InDesign, distribute it in print and over the net, convert it to Flash for non-Apple devices, and use the Flash Packager for Apple devices, now the latter are inaccessible unless the media developer hand-codes the application. This is much, much harder. At the Netlab, for example, we would have loved to produce periodicals, pamphlets, and books to read on the iPad using a workflow consisting of InDesign, Flash, and the packager, but now this is impossible. I'm not lamenting this too much. It's disappointing, but our material will appear on the Web and as PDFs.

I see no great reason to complain. The Netlab doesn't make money off its publications. But what about commercial periodicals? They'll have to struggle to monetize content on the iPad and that difficulty—precisely at a time when they're struggling just to stay afloat—will prove fatal for many. The rapid pace of creative destruction moves on. 

Read the Infrastructural City

I'm delighted to announce that the good people at m.ammoth.us have organized an online reading group to read The Infrastructural City. Find out more at their site.

Like Networked Publics, The Infrastructural City has become a long-term project that goes beyond the bounds of Los Angeles. I'm currently immersed in the network culture book, but I have some plans for a follow-up article to my introduction to The Infrastructural City later this year and maybe even a book some time later.

On atemporality

I wanted to lay out some thoughts about atemporality in response to Bruce Sterling's great presentation on the topic over at Transmediale.* We've had a dialogue about this back and forth over the net, in places like Twitter, and it's my turn to respond.

The topic of atemporality is absorbing my time now. My goal is to get the first chapter of my book on network culture up by the end of next month (I know, last year I thought I'd have it up by the end of that March, but so it goes), and it is the core of an article that I'm working on at present for the Cornell Journal of Architecture.

Anyway, I was impressed by how Bruce framed his argument for network culture. This isn't a new master narrative at all; there's no need to expect the anti-periodization take-down to come, or if it does, it'll be interesting to see the last living postmodernists make it. Instead, network culture is a given that we need to make sense of. I was also taken by how Bruce gave it an expiry date: it's going to last about a decade before something else comes along.

Then there's Bruce's tone, always on the verge of laughter. It's classic Bruce, but it's also network culture at work, the realm of 4chan, lolcatz, Chatroulette, and infinite snark. And I can imagine that one day Bruce will say "It's all a big joke. I mean come on, did you think I was serious about this?" And I'd agree. After all, a colleague once asked me if the Internet wasn't largely garbage, a cultural junkspace devoid of merit. Of course, I said, what do you take me for, a fool? She replied that she was just wondering since, after all, I studied it. I said, well yes, it's mainly dreck, but what are you going to do with these eighty trillion virtual pages of dreck, wave your hands and pretend they'll go away? It's not going to happen. So yes, snark is how we talk about this cultural ooze, because that's not only what it deserves, it's what it wants. To adopt a big word from literary criticism: snark is immanent to network culture.

I was also taken by Bruce's description of early network culture and late network culture. Again, network culture isn't a master narrative. It has no telos or end goal. We're not going to hold up Rem Koolhaas or hypertext or liberalism or the Revolution or the Singularity, the Methuselarity, or anything else as an end point to history. In that, we part from Hegel definitively. Instead, network culture is transitional. Bruce suggests that it has ten years before something else comes along. He also talks about early network culture, which we're in now, and late network culture, which we can't really anticipate yet.

I think he's on to something there, but I think we need to make a further division: network culture before and after the crash. The relentless optimism of the pre-crash days is gone, taking starchitecture, Dubai (remember Dubai?), post-criticism, the magazine era, Prada, and hedge fund trading with it. We are in a different phase now, in which portents of collapse are as much part of the discourse as the next big thing. Let's call it the uneasy middle of network culture.

Things are much less sure and they're unlikely to get any better anytime soon. It's going to be a slow ten years, equal to the 70s or maybe somewhere between the 70s and the 30s. Instead of temporary unemployment, we're looking at a massive restructuring in which old industries depart this mortal coil. Please, if you are out of work, don't assume the jobs will return when the recession ends. They won't. They're gone.

But as Bruce suggested, we have to have some fun with network culture. Over at the Netlab research blogs, we're starting to put together a dossier of evidence about practices of atemporality in contemporary culture. You'll be hearing a lot more about atemporality from me over the next month. 

*The talk is below. 

If you prefer, you can now read the transcript online here.

The Decade Ahead

It's time for my promised set of predictions for the coming decade. It has been a transgression of disciplinary norms for historians to predict the future, but it's also quite common among bloggers. So let's treat this as a blogosphere game, nothing more. It'll be interesting to see just how wildly wrong I am a decade from now.

In many respects, the next decade is likely to seem like a hangover after the party of the 2000s (yes, I said party). The good times of the boom were little more than a lie perpetrated by finance, utterly ungrounded in any economic reality and not based on any sustainable economic thought. Honestly, it's unclear to me how much players like Alan Greenspan, Ben Bernanke, Hank Paulson, and Larry Summers were duplicitous and how much they were just duped. Perhaps they thought they would get out in time or drop dead before the bubbly stopped flowing. Or maybe they were just stupid. Either way, we start the decade with national and global economies in ruins. A generation that grew up believing that the world was their oyster is now faced with the same reality that my generation knew growing up: that we would likely be worse off than our parents. I see little to correct this condition and much to be worried about.

Gopal Balakrishnan predicts that the future global economy will be a stationary state, a long-term stagnation akin to the one we experienced in the 1970s and 1980s. China will start slowing. The United States, the EU, the Mideast, and East Asia will all make up a low-growth bloc, a slowly decaying imperium. India, together with parts of Africa and South America, will be on the rise. To be clear: the very worst thing that could happen is that we would see otherwise. If another bubble forms—in carbon trading or infrastructure, for example—watch out. Under network culture, capitalism and finance have parted ways. Hardt and Negri are right: our economy is immaterial now, but that immateriality is not the immateriality of Apple Computer, Google, or Facebook, it's the immateriality of Goldman Sachs and AIG. Whereas under traditional forms of capitalism the stock market was meant to produce returns on investment, a relationship summed up in Marx's equation M-C-M' (where M is money, C is a commodity produced with the money, and M' is money plus surplus value), the financial market now seems to operate under the scheme of M-M' (see Jeffrey Nealon's brilliant Foucault Beyond Foucault). Surplus value is the product of speculation.
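Put schematically (the glosses in parentheses are my own paraphrase of the two circuits just described):

\[
M \;\to\; C \;\to\; M' \qquad \text{(commodity production: money buys a commodity that is sold for money plus surplus value)}
\]
\[
M \;\to\; M' \qquad \text{(financial speculation: money becomes more money directly)}
\]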

There's every chance that I have little idea of the lengths to which the financial powers will go to prolong this condition. After all, I would have said that we should have had a lengthy recession following the dot.com boom, and we didn't. Still, the Dow Jones, the NASDAQ, house prices (measured in real dollars), and salaries all went down over the course of the decade, so it's plausible to say that, for the most part, the economy was a shambles.

Climate change will become more widely accepted as corporations realize that it can lead to consumption and profits when little else can. If we are unlucky, the green "movement" will become a boom. We will finally realize that peak oil has passed, perhaps around 2006. Climate change will be very real. It will not be as apocalyptic as some have predicted, but major changes will be in the works. We should expect more major natural disasters, including a tragic toll on human life.

Populations will be aging worldwide during the next decade and baby boomers will be pulling more money out of their retirement accounts to cover their expenses. At the same time, younger people will find it harder to get a job as the de facto retirement age rises well into the seventies, even the eighties. A greater divide will open up between three classes. At the top, the super-rich will continue controlling national policies and will have the luxury of living in late Roman splendor. A new "upper middle" class will emerge among those who were lucky enough to accumulate some serious cash during the glory days. Below that will come the masses, impossibly in debt from credit cards, college educations, medical bills and nursing home bills for their parents but unable to find jobs that can do anything to pull them out of the mire. The rifts between all three classes will grow, but it's the one between the upper middle class (notice there is no lower middle class anymore) and the new proles that will be the greatest. This is where social unrest will come from, but right now it seems more likely to be from the Right than the Left. Still, there's always hope.

Speaking of hope, if things go right, governments will turn away from get-rich-quick schemes like "creative cities" or speculative finance and instead find ways to build long-term strategies for resurrecting manufacturing. It will be a painful period of restructuring for the creative industries. Old media, the arts, finance, law, advertising, and so on will suffer greatly. Digital media will continue to be a relatively smart choice for a career, even as it becomes more mainstreamed into other professions. For example, it will become as common in schools of architecture to study the design of media environments as it is now to study housing. We will see a rise of cottage industries in developing nations as individuals in their garages realize that they can produce things with the means of production at hand. Think of eBay and Etsy, but on a greater scale. National health insurance in the US will help in this respect, as it will free individuals from the need to work for large corporations. But all will not be roses in the world of desktop manufacture. Toxicity caused by garage operations will be a matter of contention in many communities.

Some cities are simply doomed, but if we're lucky, some leaders will turn to intelligent ways of dealing with this condition. To me, the idea of building the world's largest urban farm in Detroit sounds smart. Look for some of these cities—Buffalo maybe?—to follow Berlin's path and become some of the most interesting places to live in the country. If artists and bohemians are finding it impossible to live in places like New York, San Francisco, or Los Angeles anymore, they may well turn elsewhere, a boon to cities formerly in decline. The hippest places to live will no longer be New York or Los Angeles or San Francisco. The move toward smaller cities—remember Athens, Georgia, Austin, Texas, and Seattle?—will explode in this decade as the over-capitalized major cities face crises. But to be clear, this is an inversion of the model of the creative city. These cities will not see real estate values increase greatly. The new classes populating them will not be rich, but will instead turn to a new DIY bohemianism, cultivating gardens, joining with neighbors communally, and building vibrant cultural scenes.

With the death of creative cities, planners will also have to turn toward regions. As jobs continue to empty out, city cores will also see a decline in their fortunes. Eventually, this may resurrect places like New York and San Francisco as interesting places to live in again, but for now, it will cause a crisis. Smart city leaders will form alliances with heads of suburban communities to force greater regional planning than ever before. This will be the decade of the suburbs. We began the last decade with over 50% of the world's population living in urban areas. I predict that by the end of the next decade over 50% of the world's population will live in suburban areas. This isn't just Westchester and Rancho Palos Verdes but rather Garfield, New Jersey and East Los Angeles. Worldwide, it will include the banlieues and the shantytowns. Ending the anti-suburban rhetoric is critical for planners. Instead, we'll be asking how to make suburbs better while boosting the city core. Suburbs may become the models for cities as the focus turns toward devolving government toward local levels, even as tax revenue will be shared across broad regions.

Urban farming will come to the fore and community-supported agriculture will become widespread. This won't just be a movement among the hipster rich. It will spread to the immigrant poor who will realize that they can eat better, healthier, and cheaper by working with members of their immigrant community running farms inside and outside the city instead of shopping at the local supermarket. A few smart mayors will realize that cities in decline need community gardens and these will thrive. The rising cost of long-distance transportation due to the continued decline of infrastructure and peak oil will go a long way toward fostering this new localism.

The divisions in politics will grow. By the end of the decade, the polarization within countries will drive toward hyper-localism. Nonpartisan commissions will study the devolution of power to local governments in areas of education, individual rights (abortion will be illegal in many states, guns in many others), the environment, and so on. In many states gay rights will become accepted, in others, homosexuality may become illegal again. Slowly talk will start on both sides about the US moving toward the model of the EU. Conservatives may drive this initially and the Left will pick it up. In that case, I'm moving to Vermont, no question.

Architects will turn away from starchitecture. Thoughtful books, videos, and Web sites on the field will grow. Parametric modeling will go urban, looking toward GIS. Some of those results will be worth talking about. Responsive architecture will become accepted into the profession as will the idea of architects incorporating interfaces—and interface design—into their work.

In technology, the introduction of the Apple iSlate will make a huge difference in how we view tablets. It will not save media, but it will allow us to interface with it in a new way. eBooks will take hold, as will eBook piracy. Apple itself will suffer as its attempts to make the iSlate a closed platform like the iPhone lead first to hacks and later to a successful challenge on the basis of unfair restraint of trade. A few years after the introduction of the iSlate, an interface between tablets and keyboards will essentially replace notebook computers. Wine will advance to such a point that the distinction between operating systems will begin to blur. In a move that will initially seem puzzling but will later prove brilliant, Microsoft will embrace Wine and encourage its development. By the end of the decade, operating systems will be mere flavors.

The Internet of Things will take hold. An open-source-based interface will be the default for televisions, refrigerators, cars, and so on. Geolocative, augmented-reality games will become popular. Kevin Slavin will be the Time Web site's Man of the Year in 2018. As mobile network usage continues to grow, network neutrality will become more of an issue until a challenger (maybe Google, maybe not) comes on the scene with a huge amount of bandwidth at its disposal. Fears about Google will rise, and by the end of the decade, antitrust hearings will be well advanced.

We will see substantive steps toward artificial intelligence during the decade. HAL won't be talking to us yet, but the advances in computation will make the technology of 2019 seem far, far ahead of where it is now. The laws of physics will take a toll on Moore's Law, slowing the rate of advance, but programmers will turn back toward more elegant, efficient code to get more out of existing hardware.

Manned spaceflight will end in the United States, but the EU, China, and Russia will continue to run the International Space Station, even after one or two life- and station-threatening crises onboard. Eventually there will be a world space consortium established, even as commercial suborbital flights go up a few dozen times a year and unmanned probes to Pluto, Mars, Venus and Europa deliver fantastic results. Earth-like planets will be found in other solar systems and there will be tantalizing hints of microscopic life elsewhere in the solar system even as the mystery of why we have found nobody else in the universe grows.

Toward the end of the decade, there will be signs of the end of network culture. It'll have had a good run of 30 years: the length of one generation. It's at that stage that everything solid will melt into air again, but just how, I have no idea.

As I stated at the outset, this is just a game on the blogosphere, something fun to do after a day of skiing with the family. Do pitch in and offer your own suggestions. I'm eager to hear them.

A Decade in Retrospect

Never mind that the decade really ends in a little over a year; it's time to take stock of it. Today's post looks back at the decade just past, while tomorrow's will look at the decade to come.

As I observed before, this decade is marked by atemporality. The greatest symptom of this is our inability to name the decade: although commentators have tried to dub it the naughties, the aughts, and the 00s (is that pronounced the ooze?), the decade remains, as Paul Krugman suggests, a Big Zero, and we are unable to periodize it. This is not just a matter of linguistic discomfort, it's a reflection of the atemporality of network culture. Jean Baudrillard is proved right. History, it seems, came to an end with the millennium, which was a countdown not only to the end of a millennium but also to the end of meaning itself. Perhaps, as the Daily Miltonian suggested, we didn't have a name for the decade because it was so bad.

Still, I suspect that we historians are to blame. After Karl Popper and Jean-François Lyotard's condemnation of master narratives, periodizing—or even making broad generalizations about culture—has become deeply suspect for us. Instead, we stick with microhistories on obscure topics while continuing our debates about past periods, damning ourselves into irrelevance. But as I argue in the book that I am currently writing, this has led critical history to a sort of theoretical impasse, reducing it to antiquarianism and removing it from a vital role in understanding contemporary culture. Or rather, history flatlined (as Lewis Lapham predicted), leaving even postmodern pastiche behind for a continuous field in which anything could co-exist with anything else.

Instead of seeing theory consolidate itself, we saw the rise of network theory (a loose amalgam of ideas ranging from the work of mathematicians like Duncan Watts to that of journalists like Adam Gopnik) and post-criticism. At times, I felt like I was a lone (or nearly lone) voice against the madding crowd in all this, but times are changing rapidly. Architects and others are finally realizing that the post-critical delirium was an empty delusion. The decade's economic boom, however, had something of the effect of a war on thought. The trend in the humanities is no longer to produce critical theory, it's to get a grant to produce marketable educational software. More than ever, universities are capitalized. The wars on culture are long gone as the Right turned away from this straw man and the university began serving the culture of network-induced cool that Alan Liu has written about. The alienated self gave way to what Brian Holmes called the flexible personality. If blogs sometimes questioned this, Geert Lovink pointed out that the questioning was more nihilism than anything else.

But back to the turn of the millennium. This wasn't so much marked by possibility as by delirium. The dot.com boom, the success of the partnership between Thomas Krens and Frank Gehry at the Guggenheim Bilbao, and the emergence of the creative cities movement established the themes for this decade. On March 10, 2000, the tech-heavy NASDAQ index peaked at 5,048, roughly twice its value the year before. In the six days following March 16, the index fell by nine percent, and it was not through falling until it reached 1,114 in October 2002. If the delirium was revealed, the Bush administration and the Federal Reserve found a tactic to forestall the much-needed correction. Under the pretext of striving to avoid full-scale collapse after 9/11, they set out to create artificially low interest rates, deliberately inflating a new bubble. Whether they understood the consequences of their actions or found themselves unable to stop them, the results were predictable: the second new economy in a decade turned out to be the second bubble in a decade. If, for the most part, tech was calmer, architecture had become infected, virtualized and sucked into the network, not to build the corporate data arcologies predicted by William Gibson but as the justification for a highly complex set of financial instruments that seemed to be crafted so as to be impossible to understand by those crafting them. The Dow ended the decade lower than it started, even as the national debt doubled. I highly recommend Kevin Phillips's book Bad Money: Reckless Finance, Failed Politics, and the Global Crisis of American Capitalism to anyone interested in trying to understand this situation. It's invaluable.

This situation is unlikely to change soon. The crisis was one created by the over-accumulation of capital and a long-term slowdown in the economies of developed nations. Here, Robert Brenner's The Economics of Global Turbulence can help my readers map the situation. To say that I'm pessimistic about the next decade is putting it lightly. The powers that be had a critical opportunity to rethink the economy, the environment, and architecture. We have not only failed on all these counts, we have failed egregiously.

It was hardly plausible that the Bush administration would set out to right any of these wrongs, but after the bad years of the Clinton administration, when welfare was dismantled and the Democrats veered to the Right, it seemed unlikely that a Republican presidency could be that much worse. If the Bush administration accomplished anything, they accomplished that, turning into the worst presidency in history. In his review of the decade, Wendell Berry writes, "This was a decade during which a man with the equivalent of a sixth grade education appeared to run the Western World." If 9/11 was horrific, the administration's response—most notably the disastrous invasions of Afghanistan and Iraq, alliances with shifty regimes such as Pakistan, and the turn to torture and extraordinary rendition—ensured that the US would be an enemy for many for years to come. By 2004, it was embarrassing for many of us to be American. While I actively thought of leaving, my concerns about the Irish real estate market—later revealed as well-founded—kept me from doing so. Sadly, the first year of the Obama administration, in which he kept in place some of the worst policies and personnel of the Bush years, received a Nobel Peace Prize for little more than inspiring hope, and surrounded himself with the very same financiers who caused the economic collapse in the first place, proved that the Democrats were hopeless. No Republican could have done as much damage to the Democratic party as its own bumbling leader and deluded strategists did. A historic opportunity has been lost.

Time ended up calling it "the worst decade ever."

For its part, architecture blew it handily. Our field has been in crisis since modernism. More than ever before, architects abandoned ideology for the lottery world of starchitecture. The blame for this has to be laid on the collusive system among architects, critics, developers, museum directors, and academics, many of whom were happy as long as they could sit at a table with Frank Gehry or Miuccia Prada. This system failed and failed spectacularly. Little of value was produced in architecture, writing, or history.

Architecture theory also fell victim to post-criticism, its advocates too busy being cool and smooth to offer anything of substance in return. Perhaps the most influential texts for me in this decade were three from the last one: Deleuze's Postscript on the Societies of Control, Koolhaas's Junkspace, together with Hardt and Negri's Empire. If I once hoped that some kind of critical history would return, instead I participated in the rise of blog culture. If some of these blogs simply endorsed the world of starchitecture, by the end of the decade young, intelligent voices such as Owen Hatherley, David Gissen, Sam Jacob, Charles Holland, Mimi Zeiger, and Enrique Ramirez, to name only a few, defined a new terrain. My own blog, founded at the start of the decade, has a wide readership, allowing me to engage in the role of public intellectual that I've always felt it crucial for academics to pursue.

Indeed, it's reasonable to say that my blog led me into a new career. Already, a decade ago, I saw the handwriting on the wall for traditional forms of history-theory. Those jobs were and are disappearing, the course hours usurped by the demands of new software, as Stanley Tigerman predicted back in 1992. Instead, as I set out to understand the impact of telecommunications on urbanism, I found that thinkers in architecture were not so much marginal to the discussion as central, if absent. Spending a year at the University of Southern California's Annenberg Center for Communication led me deeper into technology; not only was Networked Publics the result, but I was also able to lay the groundwork for the sort of research that I am doing at Columbia with my Network Architecture Lab.

The changes in technology were huge. The relatively slow pace of technological developments from the 1950s to the 1980s was left long behind. If television acquired color in the 1960s and cable and the ability to play videotapes in the late 1980s, it was still fundamentally the same thing: a big box with a CRT mounted in it. That's gone forever now, with analog television a mere memory. Computers ceased being big objects connected via slow telephone links (just sixteen years ago, in 1993, 28.8 kbps modems were the standard) and became light and portable, capable of wireless communications fast enough to make downloading high-definition video an everyday occurrence for many. Film photography all but went extinct during the decade as digital imaging technology changed the way we imaged the world. Images proliferated. There are 4 billion digital images on Flickr alone. The culture industry, which had triumphed so thoroughly in the postmodern era, experienced the tribulations that Detroit felt decades before, as music, film, and periodicals were all thrown into crisis by the new culture of free media trade. Through the iPod, the first consumer electronics device released after 9/11, it became possible for us to take with us more music than we would be able to listen to in a year. Media proliferated wildly and illicitly.

For the first time, most people in the world had some form of telecommunication available to them. The cell phone went from a tool of the rich in 1990 to a tool of the middle class in 2000. By 2010, more than 50% of the world's population owned a cell phone, arguably a more important statistic than the fact that at the start of this decade, for the first time, more people lived in cities than in the country. The cell phone was the first global technological tool. Its impact is only beginning to be felt. In the developed world, not only did most people own cell phones, but the phones themselves became miniature computers, delivering locative media applications such as turn-by-turn navigation and geotagged photos (taken with their built-in cameras), together with e-mail, web browsing, and so on. Non-places became a thing of the past as it became impossible to conceive of being isolated anymore. Architects largely didn't have much of a response to this, and parametric design ruled the studios, a game of process that, I suppose, took minds off of what was really happening.

Connections proliferated as well, with social media making it possible for many of us to number our "friends" in the hundreds. Alienation was left behind, at least in its classical terms, as was subjectivity. Hardly individuals anymore, we are today, as Deleuze suggested, dividuals. Consumer culture left behind the old world of mass media for networked publics (and with it, politics left behind the mass, the people, and any lingering notion of the public), and the long tail reshaped consumer culture into a world of niches populated by dividuals. If there was some talk about the idea of the multitude or the commons among followers of Hardt and Negri (but also more broadly in terms of the bottom-up and the open source movement), there was also a great danger in misunderstanding the role that networks play in consolidating power at the top, a role that those of us in architecture saw first-hand in starchitecture's effects on the discipline. If open source software and competition from the likes of Apple hobbled Microsoft, the rise of Google, iTunes, and Amazon marked a new era of giants, an era that Nicholas Carr covered in The Big Switch (required reading).

The proliferation of our ability to observe everything and note it also made this an era in which the utterly unimportant was relentlessly noted (I said "relentlessly" constantly during this decade, simply because it was a decade of relentlessness). Nothing, it seemed, was the most important thing of all.

In Discipline and Punish, Foucault wrote, "visibility is a trap." In the old regime of discipline, panopticism made it possible to catch and hold the subject. Visibility was a trap in this decade too, as architects and designers focused on appearances even as the real story was in the financialization of the field that undid it so thoroughly in 2008 (this was always the lesson of Bilbao: it was finance, not form, that mattered). Realizing this at the start of the decade, Robert Sumrell and I set out to create a consulting firm along the lines of AMO. Within a month or two, we realized that this was a ludicrous idea, and AUDC became the animal that it is today, an inheritor of the conceptual traditions of Archizoom, Robert Smithson, and the Center for Land Use Interpretation. Eight years later, we published Blue Monday, a critique of network culture. I don't see any reason why it won't be as valuable in a decade as it is now, if not more so.

I've only skimmed the surface of this decade in what is already one of the lengthiest blog posts ever, but over the course of the next year or two I hope to come to an understanding of the era we were just in (and continue to be part of) through the network culture book. Stay tuned.

On Death

I'm usually late in sending out holiday greetings and this year is no exception. We had planned to make a physical version of our annual family photo but didn't manage to do it in time for the holidays, so we wound up sending out virtual versions. At least there was snow. I sent the photo to perhaps 150 friends and colleagues and received the usual 20 bounces. One bittersweet surprise was finding out that my friend Daniel Beunza has moved to the London School of Economics. I'm sure it'll be a great place for him—and he's closer to his home country of Spain—but I'll miss discussions about finance with this remarkable colleague. Much sadder was receiving an automated e-mail from Anne Friedberg, another friend, with whom I co-wrote the Place chapter of Networked Publics, saying that she was on indefinite medical leave. I had received this same message a while back and was concerned, but I didn't get in touch. This time, I looked her up in Google News—just in case—and was saddened to learn that she died this October.

I remember Anne and me talking about how I had discovered, via his Web page, that Derek Gross, a college friend, had died in 1996. This was before the age of blogs, but Derek updated his Web page regularly, and when I visited it to see when his band was next playing, I found that he had died and had left a record of his experience. Certainly it's something I never wished to see again, but just as surely, discovering Anne's death via the net will not be the last time.

Anne was a brilliant scholar, as evidenced by her books Window Shopping and The Virtual Window, as well as a great friend. She was crucial not only for my chapter but also for the Networked Publics group and our book, articulating issues that were fundamental to the project, asking questions, and giving me sage advice throughout. I could not have written the chapter of the book without her. Together we sat in our offices, she in her Lautner house, I in the AUDC studio on Wilshire Boulevard, and wrote the chapter simultaneously on Writely (now Google Docs). In so doing, we experienced the phenomenon of our voices becoming commingled, producing a third entity that was neither Anne nor myself. I am heartbroken that there will never be a sequel.
