On atemporality

I wanted to lay out some thoughts about atemporality in response to Bruce Sterling’s great presentation on the topic over at Transmediale.* We’ve had a dialogue about this back and forth over the net, in places like Twitter, and it’s my turn to respond. 

The topic of atemporality is absorbing my time now. I have the goal of getting the first chapter of my book on network culture up by the end of next month (I know, last year I thought it would be done by the end of that March, but so it goes), and it is the core of an article that I’m working on for the Cornell Journal of Architecture. 

Anyway, I was impressed by how Bruce framed his argument for network culture. This isn’t a new master narrative at all, so there’s no need to expect the anti-periodization take-down to come; if it does, it’ll be interesting to see the last living postmodernists. Instead, network culture is a given that we need to make sense of. I was also taken by how Bruce gave it an expiry date: it’s going to last about a decade before something else comes along. 

Then there’s Bruce’s tone, always on the verge of laughter. It’s classic Bruce, but it’s also network culture at work, the realm of 4chan, lolcatz, chatroulette and infinite snark. And I can imagine that one day Bruce will say "It’s all a big joke. I mean come on, did you think I was serious about this?" And I’d agree. After all, a colleague once asked me if the Internet wasn’t largely garbage, a cultural junkspace devoid of merit. Of course, I said, what do you take me for, a fool? She replied that she was just wondering since, after all, I studied it. I said, well yes, it’s mainly dreck, but what are you going to do with these eighty trillion virtual pages of dreck, wave your hands and pretend they’ll go away? It’s not going to happen. So yes, snark is how we talk about this cultural ooze, because that’s not only what it deserves, it’s what it wants. To adopt a big word from literary criticism: snark is immanent to network culture.   

I was also taken by Bruce’s description of early network culture and late network culture. Again, network culture isn’t a master narrative. It has no telos or end goal. We’re not going to hold up Rem Koolhaas or hypertext or liberalism or the Revolution or the Singularity, Methusalarity or anything else as an end point to history. In that, we part from Hegel definitively. Instead, network culture is transitional. Bruce suggests that it has ten years before something else comes along. He also talks about early network culture, which we’re in now, and late network culture, which we can’t really anticipate yet.   

I think he’s on to something there, but I think we need to make a further division: network culture before and after the crash. The relentless optimism of the pre-crash days is gone, taking starchitecture, Dubai (remember Dubai?), post-criticism, the magazine era, Prada, and hedge fund trading with it. We are in a different phase now, in which portents of collapse are as much part of the discourse as the next big thing. Let’s call it the uneasy middle of network culture.

Things are much less sure and they’re unlikely to get any better anytime soon. It’s going to be a slow ten years, akin to the 70s, or maybe somewhere between the 70s and the 30s. Instead of temporary unemployment, we’re looking at a massive restructuring in which old industries depart this mortal coil. Please, if you are out of work, don’t assume the jobs will return when the recession ends. They won’t. They’re gone.

But as Bruce suggested, we have to have some fun with network culture. Over at the Netlab research blogs, we’re starting to put together a dossier of evidence about practices of atemporality in contemporary culture. You’ll be hearing a lot more about atemporality from me over the next month. 

*The talk is below. 

If you prefer, you can now read the transcript online here.


The Decade Ahead

It’s time for my promised set of predictions for the coming decade. Predicting the future is a transgression of disciplinary norms for historians, but it’s also quite common among bloggers. So let’s treat this as a blogosphere game, nothing more. It’ll be interesting to see just how wildly wrong I am a decade from now.

In many respects, the next decade is likely to seem like a hangover after the party of the 2000s (yes, I said party). The good times of the boom were little more than a lie perpetrated by finance, utterly ungrounded in economic reality or any sustainable economic thought. Honestly, it’s unclear to me how much players like Alan Greenspan, Ben Bernanke, Hank Paulson, and Larry Summers were duplicitous and how much they were just duped. Perhaps they thought they would get out in time or drop dead before the bubbly stopped flowing. Or maybe they were just stupid. Either way, we start a decade with national and global economies in ruins. A generation that grew up believing that the world was their oyster is now faced with the same reality that my generation knew growing up: that we would likely be worse off than our parents. I see little to correct this condition and much to be worried about.

Gopal Balakrishnan predicts that the future global economy will be a stationary state, a long-term stagnation akin to that which we experienced in the 1970s and 1980s. China will start slowing. The United States, the EU, the Mideast and East Asia will all make up a low-growth bloc, a slowly decaying imperium. India, together with parts of Africa and South America, will be on the rise. To be clear: the very worst thing that could happen is that we see otherwise. If another bubble forms—in carbon trading or infrastructure, for example—watch out. Under network culture, capitalism and finance have parted ways. Hardt and Negri are right: our economy is immaterial now, but that immateriality is not the immateriality of Apple Computer, Google, or Facebook, it’s the immateriality of Goldman Sachs and AIG. Whereas under traditional forms of capitalism the stock market was meant to produce returns on investment, a relationship summed up in Marx’s equation M-C-M’ (where M is money, C is a commodity produced with the money, and M’ is money plus surplus value), the financial market now seems to operate under the scheme of M-M’ (see Jeffrey Nealon’s brilliant Foucault Beyond Foucault). Surplus value is the product of speculation.
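To set the two circuits side by side in the same shorthand (a schematic gloss of my own, extending the notation above rather than quoting Marx or Nealon):

M → C → M′, where M′ = M + ΔM and the increment comes from producing and selling a commodity;

M → M′, where the increment appears directly, through speculation, with no commodity in between.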

There’s every chance that I have little idea of the lengths to which the financial powers will go to prolong this condition. After all, I would have said that we should have had a lengthy recession following the dot.com boom, and we didn’t. Still, the Dow Jones, NASDAQ, house prices (measured in real dollars), and salaries all went down over the course of the decade, so it’s plausible to say that, for the most part, the economy was a shambles.

Climate change will become more widely accepted as corporations realize that it can lead to consumption and profits when little else can. If we are unlucky, the green "movement" will become a boom. We will finally realize that peak oil has passed, perhaps around 2006. Climate change will be very real. It will not be as apocalyptic as some have predicted, but major changes will be in the works. We should expect more major natural disasters, including a tragic toll on human life.   

Populations will be aging worldwide during the next decade and baby boomers will be pulling more money out of their retirement accounts to cover their expenses. At the same time, younger people will find it harder to get a job as the de facto retirement age rises well into the seventies, even the eighties. A greater divide will open up between three classes. At the top, the super-rich will continue controlling national policies and will have the luxury of living in late Roman splendor. A new "upper middle" class will emerge among those who were lucky enough to accumulate some serious cash during the glory days. Below that will come the masses, impossibly in debt from credit cards, college educations, medical bills and nursing home bills for their parents but unable to find jobs that can do anything to pull them out of the mire. The rifts between all three classes will grow, but it’s the one between the upper middle class (notice there is no lower middle class anymore) and the new proles that will be the greatest. This is where social unrest will come from, but right now it seems more likely to be from the Right than the Left. Still, there’s always hope.

Speaking of hope, if things go right, governments will turn away from get-rich-quick schemes like "creative cities" or financial speculation and instead find ways to build long-term strategies for resurrecting manufacturing. It will be a painful period of restructuring for the creative industries. Old media, the arts, finance, law, advertising, and so on will suffer greatly. Digital media will continue to be a relatively smart choice for a career, even as it becomes more mainstreamed into other professions. For example, it will become as common in schools of architecture to study the design of media environments as it is now to study housing. We will see a rise of cottage industries in developing nations as individuals in their garages realize that they can produce things with the means of production at hand. Think of eBay and Etsy, but on a greater scale. National health insurance in the US will help in this respect, as it will free individuals from the need to work for large corporations. But all will not be roses in the world of desktop manufacture. Toxicity caused by garage operations will be a matter of contention in many communities.

Some cities are simply doomed, but if we’re lucky, some leaders will turn to intelligent ways of dealing with this condition. To me, the idea of building the world’s largest urban farm in Detroit sounds smart. Look for some of these cities—Buffalo maybe?—to follow Berlin’s path and become some of the most interesting places to live in the country. If artists and bohemians are finding it impossible to live in places like New York, San Francisco or Los Angeles anymore, they may well turn elsewhere, a boon to cities formerly in decline. The hippest places to live will no longer be New York or Los Angeles or San Francisco. The move toward smaller cities—remember Athens, Georgia, Austin, Texas and Seattle?—will explode in this decade as the over-capitalized major cities face crises. But to be clear, this is an inversion of the model of the creative city. These cities will not see real estate values increase greatly. The new classes populating them will not be rich, but rather will turn to a new DIY bohemianism, cultivating gardens, joining with neighbors communally and building vibrant cultural scenes.

With the death of creative cities, planners will also have to turn toward regions. As jobs continue to empty out of them, city cores will see a decline in their fortunes. Eventually, this may resurrect places like New York and San Francisco as interesting places to live in again, but for now, it will cause a crisis. Smart city leaders will form alliances with heads of suburban communities to force greater regional planning than ever before. This will be the decade of the suburbs. We began the last decade with over 50% of the world’s population living in urban areas. I predict that by the end of the next decade over 50% of the world’s population will live in suburban areas. This isn’t just Westchester and Rancho Palos Verdes but rather Garfield, New Jersey and East Los Angeles. Worldwide, it will include the banlieues and the shantytowns. Ending the anti-suburban rhetoric is critical for planners. Instead, we’ll be asking how to make suburbs better while boosting the city core. Suburbs may become the models for cities as the focus turns toward devolving government to local levels, even as tax revenue is shared across broad regions.

Urban farming will come to the fore and community-supported agriculture will become widespread. This won’t just be a movement among the hipster rich. It will spread to the immigrant poor who will realize that they can eat better, healthier, and cheaper by working with members of their immigrant community running farms inside and outside the city instead of shopping at the local supermarket. A few smart mayors will realize that cities in decline need community gardens and these will thrive. The rising cost of long-distance transportation due to the continued decline of infrastructure and peak oil will go a long way toward fostering this new localism.

The divisions in politics will grow. By the end of the decade, the polarization within countries will drive toward hyper-localism. Nonpartisan commissions will study the devolution of power to local governments in areas of education, individual rights (abortion will be illegal in many states, guns in many others), the environment, and so on. In many states gay rights will become accepted; in others, homosexuality may become illegal again. Slowly, talk will start on both sides about the US moving toward the model of the EU. Conservatives may drive this initially and the Left will pick it up. In that case, I’m moving to Vermont, no question.

Architects will turn away from starchitecture. Thoughtful books, videos, and Web sites on the field will grow. Parametric modeling will go urban, looking toward GIS. Some of those results will be worth talking about. Responsive architecture will become accepted into the profession as will the idea of architects incorporating interfaces—and interface design—into their work.

In technology, the introduction of the Apple iSlate will make a huge difference in how we view tablets. It will not save media, but it will allow us to interface with it in a new way. eBooks will take hold, as will eBook piracy. Apple itself will suffer as its attempts to make the iSlate a closed platform like the iPhone lead first to hacks and later to a successful challenge on the basis of unfair restraint of trade. A few years after the introduction of the iSlate, an interface between tablets and keyboards will essentially replace notebook computers. Wine will advance to such a point that the distinction between operating systems will begin to blur. In a move that will initially seem puzzling but will later seem brilliant, Microsoft will embrace Wine and encourage its production. By the end of the decade, operating systems will be mere flavors.

The Internet of Things will take hold. An open-source interface will be the default for televisions, refrigerators, cars and so on. Geolocative, augmented-reality games will become popular. Kevin Slavin will be the Time Web site’s Man of the Year in 2018. As mobile network usage continues to grow, network neutrality will become more of an issue until a challenger (maybe Google, maybe not) comes on the scene with a huge amount of bandwidth at its disposal. Fears about Google will rise and by the end of the decade, antitrust hearings will be well advanced.

We will see substantive steps toward artificial intelligence during the decade. HAL won’t be talking to us yet, but the advances in computation will make the technology of 2019 seem far, far ahead of where it is now. The laws of physics will take a toll on Moore’s Law, slowing the rate of advance, but programmers will turn back toward more elegant, efficient code to get more out of existing hardware.

Manned spaceflight will end in the United States, but the EU, China, and Russia will continue to run the International Space Station, even after one or two life- and station-threatening crises onboard. Eventually there will be a world space consortium established, even as commercial suborbital flights go up a few dozen times a year and unmanned probes to Pluto, Mars, Venus and Europa deliver fantastic results. Earth-like planets will be found in other solar systems and there will be tantalizing hints of microscopic life elsewhere in the solar system even as the mystery of why we have found nobody else in the universe grows.

Toward the end of the decade, there will be signs of the end of network culture. It’ll have had a good run of 30 years: the length of one generation. It’s at that stage that everything solid will melt into air again, but just how, I have no idea.

As I stated at the outset, this is just a game on the blogosphere, something fun to do after a day of skiing with the family. Do pitch in and offer your own suggestions. I’m eager to hear them.


A Decade in Retrospect

Never mind that the decade really ends in a little over a year; it’s time to take stock of it. Today’s post looks back at the decade just past while tomorrow’s will look at the decade to come.

As I observed before, this decade is marked by atemporality. The greatest symptom of this is our inability to name the decade: although commentators have tried to dub it the naughties, the aughts, and the 00s (is that pronounced the ooze?), the decade remains, as Paul Krugman suggests, a Big Zero, and we are unable to periodize it. This is not just a matter of linguistic discomfort, it’s a reflection of the atemporality of network culture. Jean Baudrillard is proved right. History, it seems, came to an end with the millennium, which was a countdown not only to the end of a millennium but also to the end of meaning itself. Perhaps, the Daily Miltonian suggested, we didn’t have a name for the decade because it was so bad.

Still, I suspect that we historians are to blame. After Karl Popper and Jean-François Lyotard’s condemnation of master narratives, periodizing—or even making broad generalizations about culture—has become deeply suspect for us. Instead, we stick with microhistories on obscure topics while continuing our debates about past periods, damning ourselves into irrelevance. But as I argue in the book that I am currently writing, this has led critical history to a sort of theoretical impasse, reducing it to antiquarianism and removing it from a vital role in understanding contemporary culture. Or rather, history flatlined (as Lewis Lapham predicted), leaving even postmodern pastiche behind for a continuous field in which anything could co-exist with anything else.

Instead of seeing theory consolidate itself, we saw the rise of network theory (a loose amalgam of ideas ranging from mathematicians like Duncan Watts to journalists like Adam Gopnik) and post-criticism. At times, I felt like I was a lone (or nearly lone) voice against the madding crowd in all this, but times are changing rapidly. Architects and others are finally realizing that the post-critical delirium was an empty delusion. The decade’s economic boom, however, had something of the effect of a war on thought. The trend in the humanities is no longer to produce critical theory, it’s to get a grant to produce marketable educational software. More than ever, universities are capitalized. The wars on culture are long gone as the Right turned away from this straw man and the university began serving the culture of network-induced cool that Alan Liu has written about. The alienated self gave way to what Brian Holmes called the flexible personality. If blogs sometimes questioned this, Geert Lovink pointed out that the questioning was more nihilism than anything else.

But back to the turn of the millennium. This wasn’t so much marked by possibility as by delirium. The dot.com boom, the success of the partnership between Thomas Krens and Frank Gehry at the Guggenheim Bilbao, and the emergence of the creative cities movement established the themes for this decade. On March 10, 2000, the tech-heavy NASDAQ index peaked at 5,048, roughly twice its value the year before. In the six days that followed, the index fell by nine percent, and it was not through falling until it reached 1,114 in October 2002. If the delirium was thus revealed, the Bush administration and the Federal Reserve found a tactic to forestall the much-needed correction. Under the pretext of striving to avoid full-scale collapse after 9/11, they set out to create artificially low interest rates, deliberately inflating a new bubble. Whether they fully understood the consequences of their actions or found themselves unable to stop, the results were predictable: the second new economy in a decade turned out to be the second bubble in a decade. If, for the most part, tech was calmer, architecture had become infected, virtualized and sucked into the network, not to build the corporate data arcologies predicted by William Gibson but as the justification for a highly complex set of financial instruments that seemed to be crafted so as to be impossible to understand even by those crafting them. The Dow ended the decade lower than it started, even as the national debt doubled. I highly recommend Kevin Phillips’s book Bad Money: Reckless Finance, Failed Politics, and the Global Crisis of American Capitalism to anyone interested in trying to understand this situation. It’s invaluable.

This situation is unlikely to change soon. The crisis was one created by the over-accumulation of capital and a long-term slowdown in the economies of developed nations. Here, Robert Brenner’s The Economics of Global Turbulence can help my readers map the situation. To say that I’m pessimistic about the next decade is putting it lightly. The powers that be had a critical opportunity to rethink the economy, the environment, and architecture. We have not only failed on all these counts, we have failed egregiously.

It was hardly plausible that the Bush administration would set out to right any of these wrongs, but after the bad years of the Clinton administration, when welfare was dismantled and the Democrats veered to the Right, it seemed unlikely that a Republican presidency could be that much worse. If the Bush administration accomplished anything, they accomplished that, turning into the worst presidency in history. In his review of the decade, Wendell Berry writes "This was a decade during which a man with the equivalent of a sixth grade education appeared to run the Western World." If 9/11 was horrific, the administration’s response—most notably the disastrous invasions of Afghanistan and Iraq, alliances with shifty regimes such as Pakistan, and the turn to torture and extraordinary rendition—ensured that the US would be an enemy for many for years to come. By 2004, it was embarrassing for many of us to be American. While I actively thought of leaving, my concerns about the Irish real estate market—later revealed as well-founded—kept me from doing so. Sadly, the first year of the Obama administration proved the Democrats were hopeless: he kept in place some of the worst policies and personnel of the Bush administration, received a Nobel Peace Prize for little more than inspiring hope, and surrounded himself with the very same sorts of financiers that caused the economic collapse in the first place. No Republican could have done as much damage to the Democratic party as their own bumbling leader and deluded strategists did. A historic opportunity has been lost. 

Time ended up calling it "the worst decade ever."

For its part, architecture blew it handily. Our field has been in crisis since modernism. More than ever before, architects abandoned ideology for the lottery world of starchitecture. The blame for this has to be laid on the collusive system among architects, critics, developers, museum directors and academics, many of whom were happy as long as they could sit at a table with Frank Gehry or Miuccia Prada. This system failed and failed spectacularly. Little of value was produced in architecture, writing, or history.

Architecture theory also fell victim to post-criticism, its advocates too busy being cool and smooth to offer anything of substance in return. Perhaps the most influential texts for me in this decade were three from the last one: Deleuze’s Postscript on the Societies of Control and Koolhaas’s Junkspace, together with Hardt and Negri’s Empire. If I once hoped that some kind of critical history would return, instead I participated in the rise of blog culture. If some of these blogs simply endorsed the world of starchitecture, by the end of the decade young, intelligent voices such as Owen Hatherley, David Gissen, Sam Jacob, Charles Holland, Mimi Zeiger, and Enrique Ramirez, to name only a few, defined a new terrain. My own blog, founded at the start of the decade, has a wide readership, allowing me to engage in the role of public intellectual that I’ve always felt it crucial for academics to pursue.   

Indeed, it’s reasonable to say that my blog led me into a new career. Already, a decade ago, I saw the handwriting on the wall for traditional forms of history-theory. Those jobs were and are disappearing, the course hours usurped by the demands of new software, as Stanley Tigerman predicted back in 1992. Instead, as I set out to understand the impact of telecommunications on urbanism, I found that thinkers in architecture were not so much marginal to the discussion as central, if absent. Spending a year at the University of Southern California’s Annenberg Center for Communication led me deeper into technology; not only was Networked Publics the result, but I was also able to lay the groundwork for the sort of research that I am doing at Columbia with my Network Architecture Lab.

The changes in technology were huge. The relatively slow pace of technological developments from the 1950s to the 1980s was left long behind. If television acquired color in the 1960s and cable and the ability to play videotapes in the late 1980s, it was still fundamentally the same thing: a big box with a CRT mounted in it. That’s gone forever now, with analog television a mere memory. Computers ceased being big objects connected via slow telephone links (just sixteen years ago, in 1993, 28.8k modems were the standard) and became light and portable, capable of wireless communications fast enough to make downloading high definition video an everyday occurrence for many. Film photography all but went extinct during the decade as digital imaging technology changed the way we imaged the world. Images proliferated. There are 4 billion digital images on Flickr alone. The culture industry, which had triumphed so thoroughly in the postmodern era, experienced the tribulations that Detroit felt decades before as music, film, and periodicals were all thrown into crisis by the new culture of free media trade. Through the iPod, the first consumer electronics device released after 9/11, it became possible for us to take with us more music than we would be able to listen to in a year. Media proliferated wildly and illicitly.

For the first time, most people in the world had some form of telecommunication available to them. The cell phone went from a tool of the rich in 1990 to the tool of the middle class in 2000. By 2010, more than 50% of the world’s population owned a cell phone, arguably a more important statistic than the fact that at the start of this decade, for the first time, more people lived in cities than in the country. The cell phone was the first global technological tool. Its impact is only beginning to be felt. In the developed world, not only did most people own cell phones, cell phones themselves became miniature computers, delivering locative media applications such as turn-by-turn navigation and geotagged photos (taken with the built-in cameras), together with e-mail, web browsing, and so on. Non-places became a thing of the past as it was impossible to conceive of being isolated anymore. Architects largely didn’t have much of a response to this, and parametric design ruled the studios, a game of process that, I suppose, took minds off of what was really happening.

Connections proliferated as well, with social media making it possible for many of us to number our "friends" in the hundreds. Alienation was left behind, at least in its classical terms, as was subjectivity. Hardly individuals anymore, we are today, as Deleuze suggested, dividuals. Consumer culture left behind the old world of mass media for networked publics (and with it, politics left behind the mass, the people, and any lingering notion of the public), and the long tail reshaped consumer culture into a world of niches populated by dividuals. If there was some talk about the idea of the multitude or the commons among followers of Hardt and Negri (but also more broadly in terms of the bottom-up and the open source movement), there was also a great danger in misunderstanding the role that networks play in consolidating power at the top, a role that those of us in architecture saw first-hand with starchitecture’s effects on the discipline. If open source software and competition from the likes of Apple hobbled Microsoft, the rise of Google, iTunes, and Amazon marked a new era of giants, an era that Nicholas Carr covered in The Big Switch (required reading).   

The proliferation of our ability to observe everything and note it also made this an era in which the utterly unimportant was relentlessly noted (I said "relentlessly" constantly during this decade, simply because it was a decade of relentlessness). Nothing, it seemed, was the most important thing of all.

In Discipline and Punish, Foucault wrote, "visibility is a trap." In the old regime of discipline, panopticism made it possible to catch and hold the subject. Visibility was a trap in this decade too, as architects and designers focused on appearances even as the real story was in the financialization of the field that undid it so thoroughly in 2008 (this was always the lesson of Bilbao… it was finance, not form, that mattered). Realizing this at the start of the decade, Robert Sumrell and I set out to create a consulting firm along the lines of AMO. Within a month or two, we realized that this was a ludicrous idea and AUDC became the animal that it is today, an inheritor of the conceptual traditions of Archizoom, Robert Smithson, and the Center for Land Use Interpretation. Eight years later, we published Blue Monday, a critique of network culture. I don’t see any reason why it won’t be as valuable—if not more so—in a decade than it is now.   

I’ve only skimmed the surface of this decade in what is already one of the lengthiest blog posts ever, but over the course of the next year or two I hope to come to an understanding of the era we were just in (and continue to be part of) through the network culture book. Stay tuned.


On Death

I’m usually late in sending out holiday greetings and this year is no exception. We had planned to make a physical version of our annual family photo but didn’t manage to do it in time for the holidays, so we wound up sending out virtual versions. At least there was snow. I sent out the photo to perhaps 150 friends and colleagues and received the usual 20 bounces. One bittersweet surprise was finding out that my friend Daniel Beunza has moved to the London School of Economics. I’m sure it’ll be a great place for him—and he’s closer to his home country of Spain—but I’ll miss discussions about finance with this remarkable colleague. Much sadder was receiving an automated e-mail from Anne Friedberg, another friend, with whom I co-wrote the Place chapter of Networked Publics, saying that she was on indefinite medical leave. I had received this same message a while back and was concerned, but I didn’t get in touch. This time, I looked her up in Google News—just in case—and was saddened to learn that she had died this October.

I remember Anne and me talking about how I had discovered, via his Web page, that Derek Gross, a college friend, had died in 1996. This was before the age of blogs, but Derek updated his Web page regularly, and when I visited it to see when his band was next playing, I found that he had died and that the page held a record of his experience. Certainly it’s something I never wished to see again, but just as surely, discovering Anne’s death via the net will not be the last such time.   

Anne was a brilliant scholar, as evidenced by her books Window Shopping and The Virtual Window, as well as a great friend. She was crucial not only for my chapter but also for the Networked Publics group and our book, articulating issues that were fundamental to the project, asking the right questions, and giving me sage advice throughout. I could not have written the chapter of the book without her. Together we sat in our offices, she in her Lautner house, I in the AUDC studio on Wilshire Boulevard, and wrote the chapter simultaneously on Writely (now Google Docs). In so doing, we experienced the phenomenon of our voices becoming co-mingled, producing a third entity that was neither Anne nor myself. I am heartbroken that there will never be a sequel.


Alternate Scenarios Wanted

British author Charles Leadbeater critiques the “Digital Britain” plan for making broadband ubiquitous, much like the Obama Administration’s own plan. Leadbeater points out that both are flawed because they focus on infrastructure in a narrow way, failing to address the deep transformations that the Internet is making in network culture and the economy. Read his response here.

This section is particularly important:

Accelerating the spread of broadband will not save these industries but make their predicaments more difficult. Here’s the truth: plans to invest more in digital technologies will only pay off if they bring further disruption to economies that are already in turmoil. We will know when politicians are really serious about the coming digital revolution when they start to admit that it will have to cause significant disruption to established business models if it is to pay off.

This is particularly tricky in the UK. The implosion of financial services, long the flagship of the services economy, means the cultural and media industries, in which Britain has a strong position, will take on an even more important role.

Leadbeater has this right, and what he says can also be applied to the two countries that I work in, the United States and Ireland, but the problem for capital will come in monetizing what he calls “mutual media,” the rising ecology of bottom-up media production.

The problem with this model, also proposed by other authors such as Yochai Benkler and Clay Shirky, is that it does not give an adequate explanation of how to monetize such media or how to distribute wealth in a remotely equitable manner (let’s forget socialism for the moment; I’m talking about market monopolies, in particular the inherent power-law nature of networks and how we can have anything beyond Google). Let’s be clear about this: mutual media are incredibly successful not just because we can produce anything we want and upload it; they are successful because they have us producing content for free for corporations.
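To illustrate the power-law point, here is a toy simulation (a sketch of my own, not anything from Benkler, Shirky, or Leadbeater) of preferential attachment, the rich-get-richer dynamic by which links, attention, and advertising money pile up on a handful of Google-scale sites while the median site languishes:

```python
import random

def grow_network(n_sites=10000):
    """Toy preferential-attachment model: each new site links to one existing
    site, chosen with probability proportional to the links it already has."""
    degrees = [1, 1]      # two founding sites, linked to each other
    endpoints = [0, 1]    # multiset of link endpoints, for proportional sampling
    for new_site in range(2, n_sites):
        target = random.choice(endpoints)   # already-popular sites get picked more often
        degrees[target] += 1
        degrees.append(1)
        endpoints.extend([target, new_site])
    return sorted(degrees, reverse=True)

if __name__ == "__main__":
    degrees = grow_network()
    print(f"Most-linked site: {degrees[0]} links")
    print(f"Median site: {degrees[len(degrees) // 2]} link(s)")
    print(f"Top 1% of sites hold {sum(degrees[:100]) / sum(degrees):.0%} of all links")
```

Typically the most-linked site ends up with on the order of a hundred links while the median site has exactly one; the exact numbers matter less than the shape of the curve, which is why mutual media tends toward monopoly rather than equitable distribution.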

Make no mistake about it, the day that it dawns on the administration at the New York Times that there are bloggers out there who would work for free, for the fashionable cachet of a byline on a Times column, and that these bloggers are better than many of the Times’s own writers is about two weeks before the entire staff of the Arts & Leisure section finds itself looking for work at Starbucks.

The economy is undergoing an unprecedented transition. The owl of Minerva spreads her wings at dusk. Theory once again dreamed its successor era: if in the years between 1988 and 1994 theory seemed to be everything only to vanish, in the years since, culture has seemed to be everything, but on a much vaster scale, forming what appeared to be a new backbone for the economy (even if, as I’ve pointed out, it was finance all along). That’s vanishing now, and as it goes, economic crisis arrives at our doorstep. There is no way out of this on the horizon. The wealth of networks is not in their ability to promote sharing or interaction, but in their ability to strip away jobs and destroy industries without proposing sustainable new ones.

For anyone who thinks I’m being pessimistic, I do hope you’re right and I’m wrong. Really, I do.

Alternate scenarios wanted. My only caveat is that we don’t cook the books or take on more Ponzi schemes like the real estate bubble.

On Mad Men

Fellow resident of my adopted hometown of Montclair, NJ, and New York Times journalist David Carr published a new piece yesterday entitled “The Fall and Rise of Media,” in which he explores the rapid decline of the (traditional) media industry and makes a case for optimism about new media. It’s a good read; take a look.

Carr puts on a brave face as he reminds us that all reigns are temporary. The media jobs being swept away are positions that were obsolete years ago, he suggests, all but invoking Joseph Schumpeter’s “creative destruction” as an upside to the devastation that media outlets face today. As historian Jackson Lears reminds us in his latest book, Rebirth of a Nation, Americans have a longstanding fascination with the idea of rebirth, and our own era is hardly immune to it.

This struck a chord for me this morning as I had just finished watching the third season of Mad Men last night* and wondered about the show’s future. (Spoiler alert!) With the end of the old firm that the Mad Men worked for, would the new firm they would build be nimble and intelligent, able to embrace the changing terrain of the 1960s, a diabolical player in an alternate-universe version of Thomas Frank’s The Conquest of Cool? Or is it destined to be wiped out by the juggernaut of sociocultural change that was the mid and late 1960s, the way Philip Johnson was, at least for a decade? In the atemporal world of network culture, we often forget how commonly we still look backward to find reference points for transformations in the contemporary world. Here I’d point to the popularity of Mad Men today. It offers us a glimpse of a moment of massive societal transformation, as a relatively comfortable world came unglued. Perhaps four decades from now we’ll see a remake of Mad Men set at the New York Times, or at a dot.com corporation. Certainly, it would lack well-designed furniture and well-cut suits, but so it goes.

In his article, Carr points to a new generation of under-30 journalists armed with netbooks, wireless connections, and visions of reshaping their world. Let’s hope so. The dinosaurs were dinosaurs not only because of their attitude and their budgets, but also because of the poverty or, worse yet, the outright fiction of their reportage (no disrespect to David, but the Times itself often led the way with this: Judith Miller, anyone?). No question, it’s high time to renew media. Already the architectural blogosphere is smarter, sharper, and more critical than newspaper critics have been in decades.

But there’s also much to dread and not just for the dinosaurs. Rarely do things go back to normal after a serious downturn. Economic regimes undergo radical changes during recessions, often even more dramatic than during boom times when excess liquidity keeps the status quo well lubricated.

What we’re seeing now, then, isn’t just the disappearance of some crufty old salts from journalism, but rather the restructuring of the creative class. Media is very much at the forefront of this. Faced with the perfect storm of a collapsing subscription base and the decline of the advertising dollar, media corporations have figured out that the losses of income are permanent and have made cuts accordingly.

In contrast, architects are flailing about. This doesn’t mean that job losses in the profession haven’t been massive, but the profession has done little to rethink how it operates. There’s little question that we won’t see another building boom the size of the one we just witnessed in our lifetimes (nor do I wish it: there’s only so much economic destabilization we can take!). The downsizing is going to be permanent. The result will be heady competition between young, unemployed veterans with serious job experience after a few years in the workforce and a corps of new graduates trained in new skills that even those who graduated five years ago don’t have. If my readers want to see me as a pessimist, that’s fine; chalk up my position to a refusal to buy Prozac, but I’ve lived through enough recessions to know that the last few years were a huge anomaly and there’s a price to be paid for the excesses.

Beyond the collapse of the media sector, the very core of the contemporary upper middle class—jobs in media, advertising, real estate, finance, law and other services—faces evisceration, and may well follow the lower middle class into extinction over the course of the next decade. Those jobs are gone now and with them a host of possible commissions for architects. More than that, since the Obama administration’s greatest accomplishment seems to have been emptying the word “hope” of any meaning, at this point it seems likely that the shift rightward during the next elections will ensure that cities are deprived of the funding necessary to keep them afloat. Fade back to Mad Men and the early 1960s. It’s at this moment that New York reaches a turning point and Mayor Robert F. Wagner sees his city entering a multi-decade fiscal crisis from which it barely recovered.

Decades from now, will the monuments of the last decade—sadly much inferior to the monuments of the 1950s (where, after all, is our Seagram or Lever? The Standard? Magnolia Bakery maybe?)—remind us of the last days of the Creative Class and the hipster city? In 2029 will Sex and the City be as anachronistic in its depiction of the city as a thriving place for young people as Breakfast at Tiffany’s was in 1979?

Or is it possible that somehow the Obama administration will wise up? That he’ll take a cue from Harvard and fire Larry Summers together with the investment bankers that have infected the Cabinet, and insist that America not only has a public option for health insurance but that we’re going to rebuild manufacturing, in some smart, as yet unforeseen way? Heck, maybe the multitude will throw off its shackles and we’ll all live in a Shangri-La of post-Marxist immaterial culture.

One thing’s for sure, though. We’re not going back to 2002. Time will tell who succeeds in navigating this transition as individuals, nations, and worlds.

*In general, I never have the time to watch shows when they first come out, so I watch them time-shifted, either on my pitifully small Verizon DVR or on my AppleTV, Roku box, or sometimes even via Blu-ray disc from Netflix. I point this out because I want to hammer home how media consumption habits are changing. It’s particularly interesting watching my children, who have never known a world without on-demand or, for that matter, full-time PBS Kids Sprout.

The Immediated Now on Networked: A Networked Book

Networked: A Networked Book on Networked Art is now live.

Produced by Turbulence.org and supported by the National Endowment for the Arts, Networked includes a chapter that I wrote entitled The Immediated Now: Network Culture and the Poetics of Reality.

In this chapter, I suggest that network culture is not limited to digital technology or to the Internet but rather is a broad sociocultural shift. Much more than under postmodernism, which was still transitional, in network culture both art and everyday life take mediation as a given. The result is that life becomes performance. We live in a culture of exposure, seeking affirmation from the net. My chapter explores the resulting poetics of the real from YouTube to the art gallery. To be clear, the new poetics of reality is different from established models of realism, replacing earlier codes with immediacy, self-exposure, performance, and remix.

One distinctive feature of this book is that it is open for comments, revisions, and translations and you may submit a chapter for consideration by the editors. I hope my readers not only read the entire book, but contribute. Many thanks to Jo-Anne Green and Helen Thorington of Turbulence.org for putting up this project. It’s been in the works for a while and is sorely needed. 

I’m excited that the research that I did for this chapter is now taking on another form as it feeds my book on Network Culture. I’ve been writing 1,000 words a day and it’s moving at a good clip. I hope you enjoy the chapter as a preview, and if you haven’t read the introduction yet, you can do so here.   

Finally, I’ll also confess to another role in the project: the CommentPress system, developed at the Institute for the Future of the Book, came in part out of a discussion that members of the Institute and I had after one of my courses three years back. That said, WordPress isn’t the best system for this. I’m dying for it to be ported to Drupal.

 


Michael Jackson, What Have You Done?

Long overdue… a post on Michael Jackson. 

First, a quote from AUDC‘s Blue Monday:

Individuals … long to become virtual and escape into ether. It is through this physical apparatus that Hollywood stars, celebrities, and criminals obtain another body, a media life. Neither sacred nor living, this media life is pure image, more consistent and dependable than physical life itself. It is the dream we all share: that we might become objects, or better yet, images. Media life can potentially be preserved for eternity, cleansed of unscripted character flaws and accidents – a guaranteed legacy that defies aging and death by already appearing dead on arrival. The idols of millions via magazines, film, and television are disembodied, lifeless forms without content or meaning.

But the terrifying truth is that, although a media image may be eternal, like Michael Jackson, its host is prone to destruction and degradation. Data itself is not free of physicality. When it is reduplicated or backed up to file and stored via a remote host it suffers the same limitations as the physical world. It can be erased, lost, and compromised. The constant frustration of CDs, DVDs, and hard drives is that they don’t last forever, and all data is lost at once. Up to 20% of the information carefully collected on Jet Propulsion Laboratory computers during NASA’s 1976 Viking mission to Mars has been lost. The average web page lasts only a hundred days, the typical life span of a flea on a dog. Even if data isn’t lost, the ability to read it soon disappears. Photos of the Amazon Basin taken by satellites in the 1970s are critical to understanding long-term trends in deforestation but are trapped forever on indecipherable magnetic tapes. 

As you probably know, Michael Jackson’s death caused huge delays on the Internet and even prompted Google to think they were under attack. See here. Jackson’s passing from heavily-modified physical form to pure media was a giant ripple in the Net.


The Big Nothing

As I’ve fruitlessly tried to get Technorati to update its listings for this blog, it’s become more apparent that the service is in zombie mode. Like many companies today, Technorati has done away with support personnel in favor of having users try to answer each other’s questions in a discussion forum. But that’s hardly of any use anymore, as the forums fill with notes that Technorati doesn’t respond to support tickets. Still, how could they? As the economy tanks, there’s no money for firms with questionable business models like Technorati, and the server bills have to be paid before little things like functionality are addressed. 

This is hardly meant as a rant against Technorati. Rather, it strikes me that the "social Web" is imploding. Over at Newser, Michael Wolff observes that Facebook’s CFO has left and concludes that "The wheels are coming off the bus at Facebook." Things are no better at Twitter, although it seems that Google and Microsoft are competing to buy that service, so it may have a reprieve. 

In other words, I’m suggesting that what we are seeing is not so much the replacement of old media by new, but the annihilation of both. Marxists have long predicted that capital’s contradictions would undo it and, although I’m hardly optimistic about the prospects of a Red future, it seems like we’re getting a taste of this now. 

 


On Facebook Self-Portraits

I am fascinated by the forced exposure that social networking sites create. Via Alex Soojung-Kim Pang’s excellent blog The End of Cyberspace, I reached Slate author Brian Braiker’s article on how he finds seeing old images of himself uploaded to the net uncomfortable. From there, I found Euan Kerr’s piece for Minnesota Public Radio exploring the phenomenon of the Facebook self-portrait. This really piqued my interest, since I’ve been intrigued by the phenomenon ever since I joined the social network site.

The Facebook self-portrait is a product of network culture that reveals how we construct our identities today. It satisfies the version of Andy Warhol’s rule as modified by Momus: "In the future, everyone will be famous to fifteen people," except that it’s not the future anymore (in fairness, the article is 15 years old) and it’s not 15 but rather 150 or 300 people, a typical number for a circle of friends on a social network site.

The Facebook self-portrait makes everyone a superstar, famous for no particular reason but notable for their embrace of fame. So it is that on Facebook, I see friends whom I never thought of as self-conscious take photographs of remarkable humor, intelligence, and wry self-deprecation. The Facebook self-portrait insists upon mastery over one’s self-image, and the instant feedback of digital photography allows us this. Not happy? Well, try again.   

Long ago, when I was in high school, I read a book on the Bloomsbury group. I remember that the caption underneath a group photograph in the book (whose title now escapes me) pointed out that even in this über-hip clique, only one member was relaxed, only one understood that the right pose for the camera was a calculated non-pose. Our idea of the self can be read through such images: from the stiff formality of the painted portrait to the relaxed pose of the photograph to the calculated self-consciousness of the Facebook digital image. Each time, the self becomes a more cunning manipulator of the media. Each time, the self becomes more conscious of being defined outside itself, in a flow of impulses rather than a notion of inner essence.

So it was that in reading the first article, I felt that the author missed his friend Caroline’s point when she told him "You can never be too cool for your past." As your images catch up to you in network culture, you have to become the consummate manipulator of your image, imagery from the past being less an indictment of present flaws and more an indicator of your ability to remake yourself.
