Archive for September 2011


‘Post-Digital’

September 22nd, 2011 — 5:06pm

I’ve been asked to talk tomorrow at the Alphaville festival of post-digital culture, alongside Patrick Hussey from Arts and Business. I’m very flattered to have been invited, especially to be speaking with such an illustrious colleague. And since I was asked, I’ve started to think a bit about what ‘post-digital’ means. It’s a phrase you hear more and more often these days in geek circles.

I suppose I take ‘post-digital’ to mean the condition of being fully reconciled to the disruption brought about by digital technology, and in particular the Web. It is much the same, in that sense, as ‘Postmodernism’, which really means the condition of having overcome the ‘shock of the new’, the culture of high modernism, and having absorbed its lessons. Being post-digital, we are, to repeat my previous post, through the future-shock and over the nostalgia (largely – we can still be nostalgic, in increasingly kitsch ways, for modernism and for ‘digital’). ‘Digital’ in this sense, and lord knows it is rapidly becoming the most bandied-about and meaningless term in the dictionary, refers to the technologies that have disrupted our industries, our communications and our patterns of behaviour.

Being ‘post’ anything usually leads to a period of self-reflexivity and a prevailing irony about ‘progress’ that undercuts conscious artistic ambition. There’s a danger that this urge leads to a degeneration, a period in which the only statements are jokes, in which Rorty’s ‘contingency’ is so much to the fore that it is impossible to be naive. So, we look back on the period of disruption and shock and criticise it as an abnormal era, an era of extremes and wild hopes that could never be fulfilled, however deeply influential it really was (which we can only ascertain from a greater perspective, at a greater distance). This trend is evident already in many people’s analysis of the ‘digital’ era, and especially the so-called Web 2.0 period. I do it myself, all the time, pricking the bubble of technological utopianism that blew up in the early-to-mid noughties and whose chief inflators were people like Kevin Kelly and Chris Anderson.

The fact is that, insofar as it is meaningful to talk about the ‘digital’ in this way, we are still very much intellectually in its grip. And will be for at least the foreseeable future. But what is noticeable is that instead of simply evangelizing about the wonders of the Web or shying away from it, sticking our fingers in our ears and whistling and hoping it will all go away, the questions are getting much more sophisticated and more practical. Instead of asking ‘what is an API and why do I need one?’, people are much more likely to be asking ‘what is the right way to build open APIs?’, ‘how will the use of each social network help me to fulfil specific goals?’ and so on…

In other words, this stuff is just normal now. It’s becoming just as much a part of the planning process as print marketing, or programming. Or, to put it more accurately, digital questions are being considered at the same time as, and alongside, those other fundamental questions. They are intertwined with them. The process of entering the ‘post-digital’ realm, then, is really a process of acceptance and integration, or digestion. Something that was first viewed as radical, alien and even threatening has now been internalised. Nixon declared ‘we are all Keynesians now’ (would that we still were), but it seems more appropriate now to say that ‘we are all (whatever the collective noun for Vint Cerf’s disciples is) now’.

But in addition to this sense of being ‘okay’ with the arrival of disruptive technologies, I think the term ‘post-digital’ also implies a whole set of other attitudes and characteristics, largely born out of the rhetoric about ‘democratisation’ that has reared its head again since the Arab Spring. In this catch-all term, notions of real political progress and transparency have been conflated with much more superficial ideas about the scalable nature of open platforms and the widespread use of social networks. Post-digital implies a certain allegiance to the ideas of openness, interconnectedness and community that sprang up in the early days of the Web.

It implies a lack of attachment to existing hierarchies and infrastructures that have defined 20th century industries and distribution systems, a capacity for being fleet-of-foot, nimble, interested in process and criticality as well as social engagement (from an artistic point of view), and, interestingly, multiple or collective authorship. It seems to be, at its most engaged, against, or rather critical of, the mass media, the production line, the institution. Yet it is still entranced or persuaded by long-cherished ideas of individual artistic genius and talent.

Anyway, this may all be navel-gazing, but it’s the stuff that’s swirling round in my brain at the moment, and some of it is likely to exit through my cake-hole tomorrow. So watch out.

 


Attention Deficit

September 20th, 2011 — 11:32pm

‘Culture is the formation of attention’ – Simone Weil

One of the things that is changing most dramatically in the way that people now experience art and culture online is the amount of time they are prepared to commit to it. The Web, because of its interconnectedness and its vastness, makes it easy to move from one thing to the next, to compare and contrast, to graze from one site to another, to another, to another. It does not make it easy to concentrate on just one thing for long periods of time. The on-demand model of content distribution and the box-set have removed the need to wait for anything. There’s no need to go through a whole week waiting for the next instalment, no need to delay your gratification. You can have it now. And why shouldn’t you?

And of course protesting that it is just good for the soul to have to wait occasionally is a feeble argument. But if I cannot wait, if I cannot bear the suspense, or resist the temptation, does that mean that I will no longer be able to commit myself to something that might take a long time to unfold? Does it mean that I will be less prepared to make the investment in a novel or a thesis, or something that requires sustained attention? And will there ever be, therefore, online texts or narratives that can sustain that same degree of concentration, that can hold me for as long as a good book? Until there are, perhaps we cannot really speak of an online culture. We have learned to invest effort in reading long novels or watching feature-length films or visiting exhibitions at least partly because we have been taught that it is worth putting in that effort, going through the pain barrier, for the sophisticated rewards we get from these cultural experiences.

I was reading Matt Locke’s blog over on Storythings when this question really came into sharper focus. Or rather these questions, because there are at least two: how do you grab people’s attention? and once you’ve got it, how do you hold onto it? I think the first question is much easier to answer, and has been addressed in various ways by marketers, promoters, experts in social media and advertising and so on. You can quite easily (assuming you have money or access to ‘celebrities’ or other influencers) get people interested in something. But in order to sustain their interest for a long time, in an online narrative or work of art, there are many different considerations, most of which haven’t yet been fully understood.

Anything that is online has to hold its own against the galaxy of other sites, links, videos and nuggets of information that are just a click away. The competition for attention is overwhelming. And yet people do devote long periods of time online to gaming, playing in virtual worlds and artificial environments, and so on.

A great novel holds our attention for all kinds of reasons, reasons that are very complex, some to do with structure and plot, some to do with voice and character, some to do with what Wallace Stevens called ‘the surface of things’, and so on…. We want to find out what happens next, but we also simply enjoy entering the imaginative world of the story and deepening our relationship with the characters. The writer has learnt how to enthral us, drawing on a huge tradition of literary works that demonstrate what can and can’t be done, what works and what doesn’t.

I think the anxiety I feel about this is a misplaced anxiety, for much the same reason that the anxiety about the possible death of print (much overblown) is misplaced. It is misplaced because I am expecting the same thing from the internet as I get from books and other existing cultural forms – the exhibition, the feature film, the concert. I have got over the future-shock of disruption, but I am still trapped by nostalgia. The fully-realised forms of artistic work that the internet will allow have not been created yet. The thing that I am worrying about is in fact this reluctance to let go of the cultural expectations that I have grown up with. I want novels because that is what I am used to. There is a distinctive pleasure in a novel that I do not want to relinquish. And, more than that, I want it to be preserved so that other people will enjoy it, too.

I don’t think there’s any real danger of the novel dying out as a form of storytelling in the near future. For me it does serve some particular need that I have to experience a story in a particular way (and the rapid growth in ebook sales seems to confirm that there’s really no end in sight for ‘long-form’ writing just yet). I relish the well-crafted sentence. It is too deep now for me to break or forget that addiction. It is sensual. I have trained myself to read novels. The question I should be asking, though, is more like this: ‘will the internet provide new ways of telling stories that can capture my attention and hold it for long periods, and which provide similar cultural rewards and pleasures to those of a well written novel, or a play, or an exhibition of paintings or sculptures?’

I have an enormous confidence that it will, in time, do just that. But in order for the artists who will make those works to invent their new stories, or their new installations or images, they will have to free themselves of the expectations and the conventions of the existing forms. Or at least they will need to understand how those conventions and expectations can be adapted to these new kinds of work. Perhaps the ‘novels’ of the future are really going to be much more like video games, in terms of their presentation, but they will be marked by writers’ knowledge of characters, stories, speech, relationships, and so on. Instead of the kinds of games we have now, things like Grand Theft Auto or even L.A. Noire, we will see more work like Dear Esther, by Dan Pinchbeck.

And perhaps one day soon writers will really start to see how stories can be told in a way that links many different sites, platforms and media all together, in a web rather than a teleologically straight line. These stories, perhaps inspired by the ‘Garden of Forking Paths’, will start to make full use of the networked, immersive nature of the Web, to understand its grammar. In fact, this process has already begun, the first steps have been taken, and we get closer and closer to the moment when the first great storyteller of the internet emerges. It may even have happened already, known to some small coterie or network of friends or admirers, but not yet broken through to a wider audience and consciousness.

I do know this: when those stories start to arrive I won’t be able to put them down. And I will start to overcome my baseless fears about the death of the novel.

Update – I was re-reading this post yesterday and it occurred to me that I hadn’t properly developed one of the central points I was trying to make. In retrospect it seemed to be staring me in the face. I said that the internet is full of distractions, which makes it harder to concentrate on one individual story. But what I didn’t go on to say, and should have, is that this implies that, in order successfully to hold a reader’s attention, online stories may have to cut the reader off from those distractions. Or rather, shutting out some of those distractions puts the storyteller at an advantage when it comes to holding that attention online.

Try watching a show on iPlayer (it actually works even better on ITV Player, but that’s so buggy I don’t want to put you through it) in the normal/windowed player. Now try watching it in full screen. This isn’t rocket science, by any means, but it just becomes very clear how the experience of paying attention to content within a frame, within a page, within a tab, within a browser window, which may also have many other tabs open at the same time, is very different from watching it when it’s all there is to look at.
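To make that concrete, here is a minimal sketch of what a ‘distraction-free’ mode might look like for a browser-based player or story. It uses the standard browser Fullscreen API; the element ids (‘player’, ‘focus-mode’) and the button itself are hypothetical, my own invention rather than anything iPlayer actually offers.

// A minimal TypeScript sketch of a 'distraction-free' mode for a browser-based player.
// The element ids below are hypothetical; requestFullscreen / exitFullscreen and
// document.fullscreenElement are the standard Fullscreen API.
const player = document.getElementById('player');       // the video or story container
const button = document.getElementById('focus-mode');   // a 'watch without distractions' control

button?.addEventListener('click', async () => {
  if (!document.fullscreenElement) {
    // Hand the whole screen over to the story: tabs, bookmarks and other
    // windows disappear until the viewer presses Esc.
    await player?.requestFullscreen();
  } else {
    await document.exitFullscreen();
  }
});

The particular API matters less than the principle: once the browser chrome and the other tabs disappear, the only forking path left in front of the viewer is the one the story itself offers.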

Apple seems to be moving ever closer to the edge-to-edge approach with its screens, removing or shrinking the bezel at the edge of the glass so that all you can see is the film, or the photo, or the app you are working in. They did this with the transition from QuickTime 7 to QuickTime X (incidentally culling several advanced and useful features at the same time). But the point is Apple has understood that if you want people to become engrossed in something, you have to make that the only thing they can see, or hear, or touch. (I wonder, idly – and does one ever wonder in any other adverbial way? – whether it is this single consideration, this desire for pristine, immersive visual presentation, above all others, that was the genesis of the iPad and continues to drive the development of iOS? Who knows?)

So although I think the very best stories will always hold their own, always grip us and draw us in, we make it a lot easier for them to capture our full attention if we also block out the surrounding noise, hide the forking paths that our browsers hold open for us, leading perhaps to something just a little bit more exciting, or more tempting, or greener.


The semiotics of the corporation

September 13th, 2011 — 11:04pm

Okay, I know, it’s a very pretentious title. But what am I supposed to call it? That’s what this post is about.

At trade fairs and conferences, and everywhere I go these days, I notice more and more how corporations use language to describe themselves, and there’s a very specific vocabulary and register that they all employ. It has a kind of deadening effect, because it just sounds like dull blather. It is as if they are deliberately trying to limit the scope of language, to rein it in, so that their spokespeople and employees can only use certain words and phrases and styles. So that they can only parrot the ‘values’ and idealised self-image of the company in words that have no weight and no connection to the world.

And of course that is exactly what they are doing, and this is a well-developed phenomenon. But it goes beyond ‘house-style’ and ‘Plain English’ these days, until it infects every pronouncement, every statement, every single sentence that every employee utters. And the truly worrying thing is that unless you are on your guard against it all the time, unless you fight it and care passionately enough about preserving your own, unrestrained language it can even make its way into your personal life, and from there into your thoughts. Before long, the language in which the corporation enfolds itself replaces, supplants the language you would naturally use to describe your experience of that company or your feelings about it.

Forcing your employees to use a house style or branding guidelines or ‘plain English’ is already an attempt to control their thoughts, a horrible and insidious effort to restrict their creative use of language which ought to be made illegal. But the further encouragement of a culture in which everybody talks the same way and thinks the same way is not just damaging to individuals, it is dangerous because it encourages institutional thinking, destroys innovation and traps people in a straitened world of unimaginative repetition. Isn’t the very idea of a corporate ‘culture’ oxymoronic, if not thoroughly tasteless?

Corporate speak obscures meaning, to the outsider, but has a very precise meaning for initiates. It is a mixture of the grandiloquent and the banal. What I mean by that is that it aggrandizes the corporation, gives it a virtually human or even spiritual status, as though it is somehow more than simply a legal and financial entity, flatters the power of the corporation (as if it should only be spoken about in a certain way, using certain words), while at the same time making it seem utterly normal and therefore inconsequential. It says ‘thou shalt not question’, or rather ‘thou need not question’, because it is so pre-ordained, so fundamental, and so lacking in danger or deviancy or passion.

It co-opts words which previously had their own, quite different definitions, and turns them into something else. I am not just talking about the use of euphemisms. That has been going on forever (and not just in business). It is the development of a whole new idiolect, and a new lexicon, and a new way of thinking, that suddenly becomes damningly apparent when you are in the middle of a big sales conference.

And I am not just talking about the use of jargon. There has always been jargon. Some of it is funny because it is plainly ridiculous or unnecessary, some of it is boring, but jargon, at bottom, has a specific technical use.  As long as it is used appropriately, it isn’t such a problem. The real problem is the deliberate (but sometimes involuntary because unanticipated) control over all uses of language that our companies and institutions want to exert over their employees and their customers. It is the corporation’s attempt to limit the ways in which you can think about it or speak about it. This is the very essence of branding.

Some of the more intelligent companies have tried to soften their image by returning corporate phrases to simpler, more authentic-sounding ones – e.g. ‘Human Resources’ to ‘People’, ‘marketing’ to ‘engagement’… But this is only a transparent attempt to gain control of the language again, in a subtler way. It could be compared to the way in which brands in the 1980s and 90s adopted the iconography and language of the various subcultures. It’s a camouflage, but not a very good one. It makes them look smooth, sophisticated. Capital transforms itself into any shape that pleases consumers. It is the ultimate shape-shifter, but if you look closely you will never be fooled by it. There is a deathly, fake quality to it. It is like somebody who tells a joke but doesn’t quite time the punchline right.

So, in semantic terms, there is an ongoing re-definition of the meanings of words and phrases to suit only the corporate environment, and a continuous corruption of personal language to turn it instead into sales speak. The signified has changed, even if the signifiers haven’t. Cliché suddenly infects your conversations because you have been trained, consciously or not, to describe things in a particular way, to use certain nouns and phrases (and usually to excise all adverbs), to employ certain images or figures of speech. If you work in a large corporation or a bureaucracy the effect is pronounced, but simply by imbibing advertising and ‘messages’ all the time you catch the virus anyway.

In syntactic terms, the structure of sentences and phrases is also under attack. You are constantly encouraged not to write complex sentences, not to use the passive voice, not to entertain ambiguity. You are supposed to use dashes when you should use full stops, to use commas where there should be colons, not even to use capital letters. You are told to use only certain phrases or to coin them only in a way that has been approved for its ‘consistency’.

And in terms of pragmatics, you can only speculate as to the impact this has on the ability to think, speak and write in an original or interesting way. How can you escape received ideas if your entire vocabulary is built out of them?

Shakespeare writes: ‘The forms of things unknown, the poet’s pen/Turns them to shapes, and gives to airy nothing/A local habitation and a name.’ But how will we imagine or uncover the form of ‘things unknown’ if we have no room to think? Resist the clinging nonsense of business language, plain English, branding books and sales pitches, because if you don’t you may soon find that you are unable to think imaginatively and say what you want to when it really matters.

Update: I know I should have included some examples of the kind of language and control over language that companies and organisations increasingly try to exert. For a long time I was casting around for the perfect illustration of what I meant, and then out of nowhere comes this:

http://gizmodo.com/5938323/how-to-be-a-genius-this-is-apples-secret-employee-training-manual

Apple’s own secret training manual for its staff – its Geniuses. If you thought I was being paranoid, or exaggerating, just look at the ways in which the world’s most valuable company tries to shape not just the vocabulary and speech of its employees, but their whole behaviour and attitude. It is quite sobering. The level of obsessive micro-management of expression, down to the level of individual words, is amazing. Never say ‘crash’, or ‘bug’, or even ‘hot’ when referring to an Apple product. Extraordinary. Void the language of meaning, massage it, so that the customer, and ultimately you yourself, cease to think about computers or devices as fallible things. The possibility that they might not work leaves your mind because the language of failure, the vocabulary of technical problems, has been conditioned out of you.

It does work, in the sense that it helps Apple to sell a lot of computers and tablets and phones, but it also denies its employees the ability to think and speak for themselves, and that, to me, is disturbing.

