

the half-life of a tweet

April 16th, 2012 — 6:21pm

I’ve been thinking a bit recently about the nature of time and persistence in digital media. Most tweets evaporate from the mind (if they are read at all) very quickly – within a few moments, or perhaps a day, or a week at most. Their impact on the consciousness of the reader is ephemeral (and what the marketeers of the world wouldn’t give to know just how long they last, which sodium channels they aggravate or unblock, which neuronal clusters they rearrange or make deposits on). Ultimately, their half-lives in our attention are minimal. This is not to say, by any means, that they disappear completely. Anything that has been indexed by a search engine takes a very long time to die indeed. Instead it pools, along with all the other flotsam and jetsam of the web, into a great polluting mass, deep in the server farms of Mountain View or Sunnyvale. It is very much like the floating dead zone of plastic in the Pacific Ocean, and, just like that other collection of junk, it is growing all the time.
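It is worth taking the half-life metaphor literally for a moment, if only to see how brutal exponential decay is. Suppose (purely for illustration; the figure is invented) that a tweet’s hold on our attention halves every hour. Then the attention remaining after t hours is

N(t) = N(0) · (1/2)^t

and after a single day only (1/2)^24 of it survives – less than one ten-millionth of the original.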

But unlike tweets, or status updates, or individual comments in forums or on message boards, there are things, digital things, that can mark us profoundly and go on to have a long-lasting effect on us. On the way we think, or read, or write, or create things. It is interesting to note the growing trend for arcane or anachronistic content on the Web. Among geeks there is a reverence for, almost an obsession with, the very early history of the Web and its pre-history. Not just as a matter of asserting different interpretations of what it meant, a kind of dry historiographical debate, but as an archaeology of the material and culture of the time. This is related, I think, to the rise of steampunk, the retro-outfitting of technology to make it look old even when it is new. The eras of the internet age pass in months, weeks and days. New cultures, sub-cultures and ideas, groups and crazes spring up all the time and quite soon rise to replace what went before. So, rescuing or reviving a particular stratum takes on a certain importance. And it is becoming increasingly difficult to do.

Geocities is a good example of this. It was a web-hosting platform launched in the mid-1990s and acquired by Yahoo! in 1999. It was one of the first widely available spaces in which people with very few skills in web design (the general public) could actually build their own websites. It was wildly popular. Most, if not all, of the websites that were created were, from a purely aesthetic point of view, tasteless and commonplace. This shouldn’t have come as any surprise – you only need to have a passing acquaintance with interior design to know what happens when you let people loose on their own domiciles.

But now, in the formulaic era of Facebook, when everybody’s individual profile looks exactly the same, Geocities is seen as a kind of noble savagery, and we have all become ardent primitivists. We enjoy the naivety of Geocities, the lack of regimented, controlled space. We like the fact that it does not follow a template. We like its honest-to-goodness tackiness. Geocities has become, for some, a symbol of the freedom of a more honest, less cynical time. It is a cause célèbre for those who believe that the internet has now become a giant machine for corporate surveillance and for turning our personal creativity into stilted, tabular networks and ultimately into profit.

Before there was Geocities, there was Usenet. Usenet was not even a website as we know it today. It was a series of ‘newsgroups’, essentially message boards, accessed through a newsreader, where users could post messages and content. Usenet had none of the visual cutesiness of Geocities, but it did have the same sense of being beyond the control of ‘the man’ and of being driven first and foremost by like-minded individuals connecting and sharing, rather than by some huge company. In fact, some activists have recently advocated a return to a type of Usenet-based internet layer, to evade the kinds of control, monitoring and blanding-out of the Web that seem to be encroaching on us ever more rapidly.

Usenet declined, without much fanfare, from 1993 onwards, but it was rescued (in archived form) to great applause by Google in 2001[1]. Geocities and Usenet are important for two reasons: first, they are important parts of the history of how the internet became what it is today, and second, they represent alternatives, alternative models, alternative presents, that have not come into being. They are appealing because, however crude they were or whatever faults they had, they are different from what we have today. But the point is that both were superseded very quickly, and have only recently been archived or preserved, in part. They made a lasting impression on the people who used them, who now feel a sense of regret and nostalgia that they are no longer with us.

If Moore’s law holds true, and processing power and bandwidth and memory continue to increase and improve exponentially, then our devices will change, and our software will change, too. It might change more rapidly than ever. Perhaps Apple’s homogenizing effect on these things is only a temporary event, and in fact the Long Now will see periods of great diversity and fragmentation of tools and operating systems. Perhaps, sooner or later, everybody will write their own, unique software. And then be written by it.

The question for me is, does this ephemerality mean that the Web can only do things that are short-lived? Is it a permanent condition of the digital object or text to be transitory?

I don’t really believe that the human attention span is capable of being significantly shortened by Twitter. What is more probable is that the competing demands for our attention, which have recently exploded, impose smaller and smaller windows of attention on us. We have to divide our time more parsimoniously among them because there are so many more things to look at or read or respond to. And if this trend continues, there is every reason to expect that things which reward longer and more reflective forms of attention will be increasingly squeezed.

On the other hand, I don’t really think that things like the 10,000 Year Clock are the answer. We can’t deliberately slow down the pace of exchange and extend the patterns of attention, or deepen the resonance of our communications, by arbitrarily setting up very long-term projects with vast timescales. What we need to try to invent are the encoded experiences[2] that repay greater levels of attention or that mark us in a more profound way than a 140-character message can.

These inventions are already being made, by writers, artists, game designers, film-makers, musicians and others, but it still feels as if something is missing. What I suspect will happen increasingly is that what we think of as games now will evolve in the direction of what we think of as novels and films and theatre and dance. Of course this is already going on, to some extent as the result of deliberate experimentation, and to some extent because gamers are not ‘playing the game(s)’ but playing with the games.

I think it’s most likely that the conventions of these different practices and artforms will be blurred together further and further over the next few years, and that what will emerge, ultimately, may look a bit like a game, but will put the kinds of demands on us that great art does, not the kinds that more traditional games have done. Games already capture our attention for long periods. They just haven’t tended to do very much with it, or to know why they should do anything different with it in the first place. Art has always known this, and in fact it might be one of the distinguishing features of art that it does aspire to leave its marks very deep in us. If these two forces can be combined, I am sure that the result can be enormously powerful.

As for Twitter, its beauty is perhaps in its evanescence, the fact that tweets disappear so quickly. It is not just about now, it is about this exact moment. When the moment is gone, it loses some of its power. Twitter is perfectly adapted to provide the feeling of constant interconnectedness that is such a strong mirage of the internet. If tweets lingered longer, they would be dangerous, embarrassing; they would become something laborious, studious. But as they are, they give us a perfect feeling of immediacy, and constantly cause us to forget what happened five minutes ago. Twitter will be superseded in time, but for now it is gloriously suited to the chopped-up attention of our generation[3].



[1] – in fact, archiving began in 1995, carried out by Deja News; Google acquired the archive in 2001.

[2] h/t James Bridle

[3] it might be interesting to compare the demographics (age range) and internet use of Twitter users (older, professional, often very familiar, professional users of digital technology) with those of other services like BBM (younger, often from lower socio-economic groups, the misconstrued digital ‘natives’)

