Alex Steer

Better communication through data / about / archive

Kindles, iPads, and medieval readers

1474 words | ~7 min

Blogging's been a bit derailed by work, but I promised something on the history of the book in the cultural imagination, and in particular about a couple of points when existing ideas about what books mean were transformed.

As I mentioned before, there's been a lot of talk about the factors forcing a rethink of what written matter looks like and how it's used. Books, magazines and various other historically printed matter (dictionaries and encyclopedias, for instance) are shifting online. They have been for some years, but the pace of change has started accelerating as new structural and physical formats for written matter have been developed (both hardware like e-readers and encoding systems like the work of the TEI) together with new distribution and revenue models, from paywalling to The Domino Effect.

What books mean in culture

These are the physical and infrastructural changes. But since the physical form and infrastructural mechanisms of books and publishing have been reasonably stable for so long, the book as an object has built up a huge cultural back-story. The bound, printed volume carries a whole network of associations. Those associations form part of the reason academics, especially in the arts and social sciences, regard the book-length study as the most definitive kind of work; or why financiers feel the day hasn't properly started without a newspaper under the arm; or why so many religions as they are now practised rely on the exchange and circulation of heavy bound paper objects (now a rather easy task in many parts of the world; still difficult and even fatal in others, as in the past). In a very broad sense we are people of the book, and much of our cultural life has been conditioned by printers' economies of scale or the limits of binding glue.

And then, in July 2010, Amazon announced that it had sold 143 Kindle ebooks for every 100 hardcover books in the previous three months, and we realized that digital text might not just lead to the generation of new literary forms (the blog, for example) but eventually to the extinction or extreme alteration of old ones. I still remember the genuine shock from many people when I suggested that there might be no more print dictionaries (at least for UK readerships) within a few years.

Eventually this disruptive shock will generate cultural transformation as new written forms become part of the toolkit with which we think. The blog would have been difficult in the age of print - only the diaries of the famous or notorious were ever published - and the tweet would have been impossible, except as graffiti or marginal notes. Put like that, the scale of the cultural disruption begins to make sense. No wonder dictators and CEOs alike struggle to internalize the speed and reach of the web. Forget streaming video or mobile telephony. Words were never committed to paper so fast.

Western Europe has been in this situation before: what you might call disruptive bibliographical shock. A confluence of factors over the course of the twelfth and thirteenth centuries rearranged our cultural furniture by putting pressure on the status quo of the book, even then one of the most powerful ways of transmitting information and ideas and making culture. I won't go into them in detail, but I'll give a sketch.

Book futures with medieval monks

If you'd been a reasonably learned clergyman thinking about the future of book production in western Europe at the start of the 12th century, you might have fallen victim to the same kind of bubble thinking that sometimes trips us up today. To you, literacy would have meant Latin literacy: the copying and annotation of theological, philosophical and occasionally scientific texts in Latin. The twelfth century was a boom time for that kind of book production. The reconquest of parts of southern Europe from the Arabs had given Christian Europe sudden access to a wealth of Arabic texts and a decently sized bilingual, literate population to help find and translate them into Latin. Alongside the vast reserves of Arabic learning were many texts originally translated into Arabic from Greek, a language lost to the Western Roman Empire and its successor states. So a huge bulk of ancient and late antique learning, from the Church Fathers to Ptolemy's astronomical work, was suddenly up for translation. You could forgive the Latinists for feeling a bit like the film studios when DVDs replaced VHS, or like publishers no doubt feel now as they rush to reformat their back catalogues for the Kindle.

Historians used to call it the twelfth-century Renaissance, and sometimes they still do.

But drivers of change are complicated things, and one largely unforeseen consequence of the sudden boom in the translation and transmission of Latin texts was a boost to the profile of book-making as a cultural activity. No longer was book production the preserve of monasteries. Literacy and learning were in vogue. We know this from the unseemly competition among royal courts in western Europe to attract as many scribes and literate men as possible. The clear winners were the Anglo-Norman king Henry II and his wife Eleanor of Aquitaine, who built up impressive networks of scribes, book-makers and authors around them, producing new works as well as copying and studying old ones. In the early years works on science were all the rage, including a surprising number of books on timekeeping.

The boom time of the 12th century wasn't confined to books, though, and royal courts weren't just centres of learning. Like most seats of power in good times they were centres of fashion, passion and politics as well. They were also, for the first time, overstocked with powerful aristocratic women who had vastly more leisure time than their mothers or grandmothers had enjoyed, in part a consequence of greater political and economic stability in France. These women were also far more likely to be able to read - but not in Latin.

Anyone who works in publishing knows a demand shift when it happens, and this was an unprecedented demand shift. The wealthy, leisured populations of the royal courts didn't want to read about timekeeping, intricate theology and maths. They wanted something a bit more Jilly Cooper and a bit less BBC Four.

The shift happened slowly at first. The first signs were the production of more Latin treatises - except these ones weren't on astronomy, but on how lovers should carry on at court. (Notable examples include Andreas Capellanus's On Love and Walter Map's On the Trifles of Courtiers.) Being in Latin and full of clerical humour, these first efforts were probably as much pieces of wry social commentary as they were an attempt to meet changing demand, but they would have managed a bit of both, at least for male Latin-reading audiences.

Romance is in the air

But the big, decisive, disruptive shift was the production of works of literature in Western Europe's vernacular languages, not in Latin, for the first time, sponsored by these powerful royal courts. These works were designed to meet the demands of a new audience, in content as well as language, so it's unsurprising that the catch-all term for all those not-quite-Latin languages (which we still use today) became a byword for racy, exciting fiction.

They called them romances. The French still call book-length works of fiction romans (and we call them novels).

It was a kind of writing that didn't exist before, in languages like French and Occitan that had had very little tradition of book-making. Over the next century or so it would spread prolifically, driven by demand, surrounded by moral panics and even some rather clever PR attempts to lend religious instruction some of romance's excitement (probably the main driver behind the equally meteoric rise of vernacular saints' lives between the twelfth and fifteenth centuries, going head-to-head against romance for the hearts and souls of aristocratic readers).

Unlike the present shock to the system, the big drivers of change weren't technological - the form of books didn't change dramatically - but social, economic and linguistic. Still, the change permanently shifted the idea of what books were for, and introduced the idea of fiction to the West. So your holiday reading probably owes something to Henry II, his wife, and their set of well-connected writers.

This has gone on more than long enough, but sometime soon I'll talk about the 16th century, and a series of more technological shocks to the system, mainly print.

# Alex Steer (08/03/2011)


The future of (the idea of) the book

630 words | ~3 min

There's been a 'kaboom' moment in publishing journalism, probably because there's been a 'kaboom' moment in the history of the book. Newsweek offers quite a deft summary of the big factors changing books and publishing, and the changes themselves. They may not all be surprising, but the scale and speed of change have suddenly become decisive.

Those with an interest in the history and future of the book (let me point you towards James Bridle and Ben Hammersley, for instance) have been tracking the drivers of change in publishing, and the early expressions of a seismic shift, for several years now, but a lot of that remains under the radar of public attention. Futurists (think strategy consulting rather than crystal ball-gazing here) often use the 'seasonal' classification to think about awareness of change. I like it and use it quite a bit, so I'll use it here. In essence it's a nice piece of shorthand, and it goes like this:

  1. Spring: Fringe issue. Only picked up on by specialists, scientists, radicals, and similarly esoteric and out-of-the-way sources.
  2. Summer: Specialist issue. Discussed in subject-specific sources such as journals, conferences, blogs for expert readerships.
  3. Autumn: Accelerating issue. Turns up in newspaper articles, popular magazines, blogs, TV shows, etc. as a widely-discussed new thing.
  4. Winter: Mainstream issue. Regularly featured in government documents, corporate strategies, etc.

(Side note: I love that government reports are the benchmark medium for issues as they enter the 'duh, well, obvious' phase of their emergence.)

It feels right now like the challenges to the established forms of the book are moving from Summer to Autumn, and we're starting to take them seriously. Most of the discussion, in pieces like the Newsweek one, is about direct challenges to form, economy and use: what books are, and how we buy and read them. Newsweek's talking heads are also smart enough to know that challenge doesn't equal overthrow when it comes to media forms (see my many rantings on the 'Twitter exists, therefore email is dead' type of fallacy), and that there's no reason to assume that paper books don't have significant vitality left for certain functions.

So let me float another question, one that may still be hovering in Spring. What will the emerging challenges to the form, economy and use of the book do to our idea of the book as a meaningful unit within culture?

If this sounds like a crazy question, there's a good reason for that. The form of the book had a reasonably (though not entirely) easy time of it between about the eighteenth century and the mainstreaming of the web in developed countries about fifteen years ago. It's now taking a bit of a beating, but not for the first time. I don't have time now, and this post is long enough (an idea which raises questions of its own, I know), but over the next few days I'll aim to write one or two more posts which look at a couple more flashpoints in the history of the book - specifically, in Europe in the 12th and the 16th centuries - and offer a way of thinking about how cultures and books make sense of each other.

Meanwhile, I'd love to get a bit of a thread going on the future of the book, one of my favourite topics. Please do comment, trackback, tweet (I'm @alexsteer) or similar if you have thoughts on the bigger implications of the changing nature of books. Thanks!

# Alex Steer (06/02/2011)


The internet, criticism and the culture wars

735 words | ~4 min

Malcolm Gladwell, mid-way through either stating the obvious or missing the point in a New Yorker piece on social media and activism, writes:

We would say that Mao posted that power comes from the barrel of a gun on his Facebook page, or we would say that he blogged about gun barrels on Tumblr—and eventually, as the apostles of new media wrestled with the implications of his comments, the verb would come to completely overcome the noun, the part about the gun would be forgotten, and the big takeaway would be: Whoa. Did you see what Mao just tweeted?

To me the most important word in that paragraph is 'whoa'.

'Whoa' is not a word you put in the mouths of others if you want to portray them as serious and independent-minded thinkers. Nor, let's be honest, is 'apostles'. Speaking of long words, rhetoricians (there's one) used to call this tactic 'prosopopoeia' (there's a second) - speaking in the voice of another, to make a point of your own. Gladwell's point here, I think, is that social media users are essentially shallow and vacuous.

Pause briefly to ponder the vacuities that print manages to bestow upon the world every day without anyone blaming the medium (yes, McLuhan, yes). Even so, that derisory 'whoa' speaks volumes about the ongoing vitality of the old debate about 'high' vs 'low' culture.

Only a few days ago, the Observer's Culture section went all 'O Tempora! O Mores!' about the shift in power from professional critics to amateur criticism. Unsurprisingly, social media was roped in for a whacking:

The real threat to cultural authority turns out not to be blogging but social networking... The point isn't that the traditional critics are always wrong and these populists are right, or even that these comments are overwhelmingly negative or invariably take on the critical consensus. More often than not, they aren't and they don't. The point is that authority has migrated from critics to ordinary folks, and there is nothing - not collusion or singleness of purpose or torrents of publicity - that the traditional critics can do about it. They have seen their monopoly usurped by what amounts to a vast technological word-of-mouth of hundreds of millions of people.

Those familiar with the history of debates on the function of criticism, particularly around the rise of English studies in the late 19th/early 20th century, might feel the icy breath of T.S. Eliot ('breathing the eternal message of vanity, fear, and lust', maybe) and others at this point. If you're not familiar, I recommend Chris Baldick's The Social Mission of English Criticism, or virtually anything by Stefan Collini. This is not a new story - mass participation is the death-knell of criticism, as surely as home taping kills music.

There is no doubting which side of the high/mass culture division the internet falls on. What's doubtful is the continued insistence that mass culture has to be low culture. (If you don't think you're prone to this, go on, ask yourself: which is really better, Classics or media studies? If you have an answer off the top of your head, either way, you're prone.) Look all over the web and you will find thoughtful, sustained engagements with the future of media and culture that make many 'function of criticism' arguments look paltry and narrow. For all the dumb, there is plenty of smart - and, whereas much high cultural aesthetics invites you to pay no attention to the man behind the curtain, the best critical theories of mass culture ask you to understand the smart in the context of the dumb. Not just with the aim of appreciation of the smart (as in many aesthetic formalisms), but with the aim of understanding various cultural systems as wholes (or interlocking sets, or whatever - pick your metaphor), and perhaps even of engaging and (dare I say) improving, of helping the smart outnumber the dumb.

It's enough to make you say 'whoa'. And, thanks to various miracles of technology and economy, it's in all of our hands (quite literally; or pockets). Which means it may be time to take criticism seriously as a discipline. Again.

# Alex Steer (03/02/2011)


Media and revolutions: Microscopes and megaphones

333 words | ~2 min

The current uprising in Egypt has prompted a lot of talk about the role of social media in political protest. The most sensible comments have been to the effect that, no, it's not an enabler, but yes, it can be an accelerant.

What about the relationship between 'social' and 'mainstream' media? Here are my thoughts. Take them to bits and see if they work.

At its best, social media provides a microscope. It plays the same role foreign correspondents traditionally play, giving a close-up view of what's happening. Social media provides many correspondents and allows for the capture and rapid sharing of lots of detail.

At its best, mainstream media provides a megaphone. It takes the important stories and gives them scale.

At their best, together, the microscope feeds the megaphone. Take Egypt. Without the microscope, today's news headline would be Rival factions battle in the streets of Cairo. The microscope shows that the pro-Mubarak faction is a rent-a-mob of thugs and cronies. As a result, the megaphone tells a better story. Or should, at least.

Between the two there has to be a filter. The microscope is notoriously bad when it pretends to be a megaphone. Easy me-too functions like retweeting are great for spreading news, but also great for giving isolated observations an undue sense of scale (which is, after all, what microscopes are for). The megaphone is just as bad when it tries to be a microscope, given the pressures of time and money that hinder true investigative journalism.

The filter is the journalist - at least, what journalists may be becoming. Not the sole unearther of facts and pursuer of leads, but the skillful observer of everything the microscope reveals; the maker of connections; the person who finds the story, and knows how to set it up for the megaphone. If professional news has a future it may be through this analytical function. Rehashing press releases is, after all, little more than retweeting.

# Alex Steer (02/02/2011)


Getting economic history wrong with Google NGrams

267 words | ~1 min

Listen very carefully, I shall say this only once (or more). Google Ngrams is lots of fun, but it is next to useless for analysing the history of language, or of ideas.

I'm talking to you, New York Times Economics Blog. An otherwise sound blog post on the flawed idea of an economic 'new normal' includes this chart, which is used to argue that the idea of the 'new normal' was also bouncing around in the past 'during some major economic shocks'.

[Chart: Google Books Ngram for 'new normal', 1900 onwards]

Right, so what's happened here? Clearly a forty-year stretch does not constitute a period of economic shock, and everything other than that massive hill of data does not constitute a meaningful change in usage. If you bother to go to Google Books and search for 'new normal' between 1900 and 1940, all becomes clear. You wade through pages and pages of hits describing 'new normal schools/colleges'. These are, for the most part, references to the Normal Schools (teacher training schools) that sprang up across the States in the late nineteenth and early twentieth centuries. Sorry to be a churl, but seriously: if you want to be an economic historian, do your homework properly.
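Doing that homework doesn't take much. Here's a minimal sketch of the kind of sanity check I mean, assuming you've downloaded the relevant Google Books Ngram count files (the published export format is one tab-separated line per ngram and year, starting with the ngram, the year and the match count); the file names and the choice of 'new normal school' as the confounder are placeholders for illustration, not a finished tool:

```python
import gzip
from collections import defaultdict

# Assumed local copies of the Google Books Ngram export files
# (tab-separated lines beginning: ngram, year, match_count, ...).
TWOGRAM_FILE = "eng-all-2gram-ne.gz"     # hypothetical filename
THREEGRAM_FILE = "eng-all-3gram-ne.gz"   # hypothetical filename

def yearly_counts(path, phrase):
    """Sum match_count per year for an exact (case-insensitive) phrase."""
    counts = defaultdict(int)
    target = phrase.lower()
    with gzip.open(path, "rt", encoding="utf-8") as fh:
        for line in fh:
            fields = line.rstrip("\n").split("\t")
            ngram, year, match_count = fields[0], fields[1], fields[2]
            if ngram.lower() == target:
                counts[int(year)] += int(match_count)
    return counts

new_normal = yearly_counts(TWOGRAM_FILE, "new normal")
# Only one confounding variant; 'new normal schools', 'new normal college'
# and friends would need the same treatment.
new_normal_school = yearly_counts(THREEGRAM_FILE, "new normal school")

for year in range(1900, 1941):
    total = new_normal.get(year, 0)
    schools = new_normal_school.get(year, 0)
    share = schools / total if total else 0.0
    print(f"{year}: {total:6d} hits, {share:5.1%} followed by 'school'")
```

Anything along these lines would show immediately how much of that pre-1940 hill is teacher training colleges rather than talk of economic shocks.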

# Alex Steer (14/01/2011)


Predicting the new annoying for 2011

330 words | ~2 min

Remember a little over a year ago, when the mutterings about location-based technology were starting to move into the mainstream? Well, it's happening again, this time with near-field communications.

Looks like NFC is all set to become this year's Thing That Marketers Talk About And Try To Do Until It Gets Annoying.

Print this list off, put it in a drawer, and take it out at the start of 2012.

Did you do that? Good. Now, how many of the following have you seen since the start of 2011?

  1. Shops that broadcast deals to your phone as you walk past
  2. Cereal boxes that launch a URL when you wave your smartphone near them
  3. Blog posts telling you that NFC is a great way to tell 'engaging brand stories' to consumers
  4. Secret parties/gigs run by sports or music brands where you need an NFC enabled phone and a special download to get through the door
  5. NFC-powered group buying/price manipulation gimmicks - turn up, swipe, drive down the price

Right, back to the present.

People in advertising and marketing: please, get a grip before this gets crazy. It's just a swipy phone thing.

If you start off by thinking 'Hmm, how can I use this new technology to push advertising messages?', you will basically be doing it wrong for several reasons.

Take the fact that this new thing exists. Put it in a pocket in the back of your brain. If ever you find yourself with a problem that it could help solve, take it out and dust it down. Make sure it's strategically useful and creatively interesting. If by the time you bring it out it feels like it's been done to death (which, if you're doing this any time after about April 2011, it will), well, think of something else.

For now, though, I'm going to spend the next few weeks waving my phone at walls until something happens.

# Alex Steer (05/01/2011)


Taking TV seriously

1220 words | ~6 min

The Christmas/New Year period inevitably means three things. One, people blog less. Two, the sales start. Three, people start predicting trends.

So I'm going to kick-start this blog again after its holiday slump with a longish post talking about the end-of-year sales and the new-year trends; and in particular about a trend that seems to have a slightly lower profile than I think it deserves.

Believe it or not, the trend is that TV is becoming more important. And it's only really worth talking about it because so many of us have got sidetracked by computers and mobile over the last few years.

Ever since Black Friday (in the US), this year's winter sales story has been about the growth in online retail. This year online spending rose 15% to $36.4bn over the holiday season in the US, well above the overall projected spending rise (3.3%); and the top online retailers saw 42% more traffic than last year. The growth areas for online sales were computer hardware (23%), books and magazines (22%), consumer electronics (21%) and software (20%).

TV feels like the forgotten screen in the consumer electronics category. Read through JWT's excellent 100 Things to Watch for 2011 and you'll notice how much of it is about computing, and particularly mobile computing. There is one mention of TV, in a discussion on video calling. With TV barely featuring in such a comprehensive collection of forecasts, you won't be surprised to know that it's also absent in lots of smaller, more off-the-cuff predictions that do the rounds at this time of year.

Which is odd, as this is set to be a very good year for the big glowing box in the living room. It looks like 2010 saw 17% year-on-year growth in TV sales worldwide. Most of the growth is being driven by emerging markets, especially China, and by new technology going mainstream fast in mature markets. Shipments of LCD TVs grew 31% in 2010, but that adoption curve is already flattening, with 13% growth forecast for 2011.

The new thing for the new year will be internet TV, projected to account for a fifth of new TV shipments. Since this has the word 'internet' in it, there's predictably been lots of chatter about it, what with Google TV, Apple TV and what looks like a contender from Microsoft about to be shown off at CES in a couple of days' time.

Globally, though, the growth will be fuelled by the newly affordable, not just the new. There's a huge emerging market for TV, behind the US/UK technology curve but with a lot of collective spending power. In South Africa, for example, TV ownership by household rose from 53.8% to 65.6% between 2001 and 2007, and is catching up with radio ownership (76.6% in 2007). Of the 3.2 million South African subscribers to the DSTV pay-TV service, over 10% were added in the six months to September 2010, when DSTV introduced its lower-cost Compact package to attract new customers. There are lots of countries where TV has a lot more growth potential than in SA (I just don't have that data - sorry).

So TV is about to become very big in lots of new markets. In well established markets, TV is already big, but it's quietly becoming more important.

A couple of years ago, TV got a brief flurry of attention as part of that whole raft of stories about recession-proof consumer behaviour. Apparently, in late 2008 and early 2009, we were all spending more time at home to save money, so we were buying big new TVs in anticipation of spending less time going out and more time on the sofa.

That TV story came and went along with allotments, lipstick sales, hemlines, and other barometers of recessionistadom. That's a pity, as I think it masked something bigger. We're starting to pay a lot more attention to TV, and to demand more of it.

3D TV remains a gimmick, and sales have been disappointing. This is usually blamed (as in the link) on the lack of worthwhile 3D content, but my hunch is that 3D TV basically misjudges how we watch TV. Unless we're fanatical, most of the time we give TV continuous partial attention. It's one of the reasons we don't mind ad breaks on TV but hate them on streaming video, and why we'd walk out of the cinema if they interrupted the screening to show commercials. 3D TV assumes we give TV our full attention, so are happy to wear big plastic glasses while watching. We don't, so we aren't. We get up, walk round, chat to each other, make tea, play on our phones, and so on.

But 3D TV gets it slightly right even as it gets it wrong. It assumes we now want to give TV sets our full attention. We don't, but we are giving them more than we used to. We no longer want them just to be good enough: on in the background, unobtrusively, like radios with pictures.

I think TVs are turning into appointment-to-view technology, serving those occasions when we want to sit down and be entertained by one thing for an extended period. Hence the success of HD; of hard-disk recorders; of the Wii, Kinect and the Xbox 360 - games hardware whose natural home is the living room, not the bedroom. 3D glasses are just a step too far in the right direction.

It's lasted too long to be a recessionary fluke. Perhaps it's because we have more, and more portable, devices (smartphones, tablets, laptops) that fulfil our continuous-partial-attention entertainment needs. We now expect TV to fill the niche that computers and mobile devices don't: something to entertain us, quite immersively, for specific periods of time, when we choose.

This obviously has implications for advertising, beyond the much-discussed impacts of recording and the new potential of internet TV to target ads. If we demand more of TV, we may be less forgiving of interruptive, uninteresting advertising. (Or maybe that's just wishful thinking.)

With the exception of 3D, which I reckon will soon be relegated to gaming and cinema (fully immersive screen entertainment), I'd bet that this will be a good year for the humble telly, and everything that plugs into it.

# Alex Steer (04/01/2011)


Releasing advertising into the wild

888 words | ~4 min

The other day I found myself in Strand Books, an enormous second-hand bookshop in New York. Specifically, on the top floor, which only seems to be accessible by taking the lift; in a tiny narrow passage near the back of the rare books section - quite literally two long ceiling-height bookshelves with a person-sized gap in between, and a sign above that says Books about books.

It's the bibliography section, and I think it's brilliant, so I'll be quite happy if no-one else ever finds it tucked away at the back there. But I can see why people might not share my enthusiasm for books about books, which may seem like the most abstract topic possible.

One of the things I've been thinking about, and occasionally talking about, over the last couple of weeks is the idea that a piece of communication shouldn't be judged on how much it resembles another piece of communication.

It sounds obvious, but think how little advertising really resembles anything else in popular culture. A linguist would say (and linguists have) that advertising has a remarkably consistent set of discourse features: a bunch of things that mark advertising language out as advertising language. You probably know what I'm talking about. The same is true of visual language. It's the reason all food commercials have a pack shot at the end, and all drinks ads feature someone drinking the product (just in case you've forgotten what you do with a bottle of beer, like Ted Striker from Airplane!), to say nothing of acting staples like the 'instant-relief-from-headache smile' and the 'what-will-we-do-with-this-problem-debt? frown'.

Like my 'books about books', these are ads about ads. All their conventions come from advertising. It's a small, incestuous gene pool of features. And a lot of communications make sense because we're trained to recognize what's in them from previous communications.

When you compare advertising to other kinds of cultural production, you realise how different it's become, like the wildlife of Madagascar as it evolved in isolation, or the rednecks from Deliverance.

Marketing messages used to have it easy because they occupied their own space where they were surrounded by their own kind: ad breaks, billboards, predictable bits of newspaper and magazine pages. Now they don't.

Marketing messages now compete, more and more, with other kinds of popular cultural production. They live alongside all sorts of other creative and informative content. Take YouTube channels for example: some are from brands, some are from broadcasters, some are from people, with little to separate them. There are no ad breaks on the internet. In what we call ambient media (stuff you can see in the street, trip over, bump into or be given) the same kind of thing is happening, mainly because of the falling cost and complexity of production. TV and radio are a different story, still dominated by the logic of booking and paying for space, though internet/TV systems may challenge that, as does the growth of the web as a vector for entertainment media - streaming video and music services, that sort of thing - that are eroding TV and radio's share of eyeballs and earlobes.

It's not a breakthrough of analysis to say that a lot of advertising is aggressive and bullying and insistent and shouty, and has lots of other similar bad personality traits. But it's true, and it's amazing that it was ever allowed to get like that.

The fact that, here in the US, Congress has felt the need to ban TV advertising from being louder than the programming around it shows that a lot of advertising is literally shouty. It's a law that shouldn't have needed to exist. But there it is, on the books, because advertising felt like it had a license to push itself increasingly aggressively on its audiences. That's what happens when you stop thinking about culture as a whole and start making ads about ads, entering into an arms race with other advertisers to see who can make the most ad-like ad: the biggest logo, the shoutiest voiceover, the smallest small print, the longest appearance of the phone number on the screen. That's when you end up with the Cash4Gold ads.

As the idea of 'ad space' falls apart, the space ads occupy is popular culture, and it's crowded there. Ads aren't just compared to other ads, they're compared to every kind of popular cultural production, and the benchmarks are things like interestingness, elegance, utility and entertainment, not the old metrics of consistency, uniqueness or reinforcement. The Old Spice campaign was great because it was funnier than most other things on YouTube; Nike Plus is great because it works better than lots of other run-tracking systems.

Great communications have always been interesting rather than shouty, so this is not new news. It's just that the stakes are higher than they've been for a while, because advertising is less protected as a practice.

It's been released into the wild, where like other elements of popular culture it will find itself having to do useful and interesting things to survive, rather than stand there talking loudly about itself.

# Alex Steer (13/12/2010)


Squeezing out originality on Facebook

369 words | ~2 min

Today Facebook launched a new version of the profile. The 'info' page is concerning.

Here's the kind of thing that's at the top:

Facebook profile screenshot

And a little further down:

Facebook profile screenshot

As you can see, I don't have a lot on my profile. But assuming I want to add a TV programme, I'm invited to pick from a pre-selected autocomplete list, like this:

Facebook profile screenshot

In short, there's almost no room left for anything other than structured data. Back in the late middle ages, when I joined Facebook, putting daft things as your interests became the cause of a kind of strange one-upmanship, as did playing around with the few restrictions imposed by Facebook (like the fact that status updates used to have to start with 'is', which spawned a generation of updates of the kind 'John Smith is the fridge has broken again').

Now we're invited, all of us, to exercise as little lateral thinking as possible, and to make our responses fit into a series of easily-analysed fields. That's because it's easier to target ads when you limit the options to a set of things a computational parser can understand. Computational parsers are very bad at irony and wordplay. Perversely, Facebook's users (whose centre of gravity is still among twenty- and thirtysomethings) are very good at both.
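To make that concrete, here's a toy sketch (the field names and values are invented for illustration, not Facebook's actual schema) of why structured profile data is so much easier to target ads against than free text:

```python
# Hypothetical, simplified profile data - not Facebook's real schema.
structured_profile = {
    "tv_shows": ["The Wire", "QI"],   # chosen from an autocomplete list
    "books": ["Middlemarch"],
}

free_text_interests = "the fridge has broken again, ironic lists, QI (sort of)"

def likes_show(profile, show):
    """Targeting against a structured field is a single exact-match lookup."""
    return show in profile.get("tv_shows", [])

print(likes_show(structured_profile, "QI"))  # True

# The free-text version needs a parser that can cope with irony, in-jokes and
# wordplay before it can decide whether "QI (sort of)" should count as a like.
```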

The other thing affluent, tech-savvy twenty- and thirtysomethings are very good at is punishing brands when they think they're putting their business considerations in the way of good user experience and service.

At the end of a year of grumblings about privacy settings and control over personal data, this is a small but bad sign. Most of us will forgive these issues in return for a really good experience. But if people come to feel that the ability to use and customize Facebook has been compromised by the site's willingness to package and sell their data to advertisers, they may turn away unexpectedly fast.

# Alex Steer (06/12/2010)


Permanent brokenness: stray thoughts on wikis and leaks

572 words | ~3 min

Quick one this, late at night.

Fast Company Design has a nice overview of some of the best representations of the leaked US embassy cables from Wikileaks.

Each of these is a neat summary of why the internet changes culture.

These tools are made possible because digitally stored text is greppable. You can search it, rearrange it, perform queries on it and get the results back. Various scripting languages and tools provide a huge range of ways to muck about with text in a way that only used to be possible if you had a pair of scissors and didn't mind being banned from your local library.

In a greppable world, nothing that is made public can be buried in detail. Form and arrangement of texts can be remade on the fly. Tasks like concordancing, which used to take years, can be done in seconds using a script. And, of course, huge stacks of documents can be skimmed for salient details just as fast.
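For the avoidance of vagueness, here's a minimal sketch of the kind of thing I mean: a keyword-in-context concordance over a plain text file, the sort of job that once meant index cards and years of labour. The file name and search term are placeholders; it's an illustration, not a tool.

```python
import re
import sys

def concordance(text, term, width=40):
    """Print a keyword-in-context (KWIC) line for every hit of `term`."""
    for match in re.finditer(re.escape(term), text, flags=re.IGNORECASE):
        start, end = match.start(), match.end()
        left = text[max(0, start - width):start].replace("\n", " ").rjust(width)
        right = text[end:end + width].replace("\n", " ")
        print(f"{left} [{match.group(0)}] {right}")

if __name__ == "__main__":
    # Usage (hypothetical): python concordance.py cables.txt ambassador
    path, term = sys.argv[1], sys.argv[2]
    with open(path, encoding="utf-8", errors="replace") as fh:
        concordance(fh.read(), term)
```

Swap the print statement for a count, a sort or a fancier regular expression and you have the whole 'search, rearrange, query' toolkit in a dozen lines.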

In other words: search doesn't just let you find things in the world. It lets you remake them.

If you know the first thing about computers, what I'm writing is the most obvious thing in the world; and I've talked about it here before, in the context of illusions of online privacy; and I'm not a computer scientist, so I'm hardly well-placed to talk.

If you work in marketing or media, you can talk all you like about how the internet/web/social media/Facebook/Foursquare/whatever changes everything. But if you don't understand this, you don't really understand why the web matters, and you won't be able to roll with its punches.

The web matters to culture because it's the mainstreaming of a large set of information-mucking-about-with tools and techniques that used to be the preserve of programmers, linguists and various kinds of mathematicians whose jobs I don't understand. Those tools, and the things we make with them, have become such a part of our lives that they have brought with them new expectations about culture. Expectations like the idea that we should be able to take things, mash them up, patch them, fix them, make them better and pass them on when they don't do exactly the job we want them to do.

This doesn't mean participation or user-generated content or even customization tweaks. It means the idea that everything is basically permanently broken - that problems need to be solved (and can be solved) on a case-by-case basis, quickly. If you make ads, that's a challenge (and if you make products or services, even more so). Mainly, it feels like it's a matter of approach. Too much fanfare and ta-daa, and you'll look pretty bad when your big idea, big campaign, big product (etc.) is met by a kind of grumpy it'll-do-for-now-ness.

How do you get attention and make yourself useful in a hacking, grepping popular culture?

Can 'release early, release often' work for ads or brands? Can you find room for a big idea in culture by executing it in a series of permanent-beta ways? Can that thinking work in media other than digital without seeming like twee engagement-i-ness? Would it be appropriate to end this post with a bunch of not-quite-answered questions? Or is that too meta?

# Alex Steer (30/11/2010)