Alex Steer

Better communication through data

The internet, criticism and the culture wars

735 words | ~4 min

Malcolm Gladwell, mid-way through either stating the obvious or missing the point in a New Yorker piece on social media and activism, writes:

We would say that Mao posted that power comes from the barrel of a gun on his Facebook page, or we would say that he blogged about gun barrels on Tumblr—and eventually, as the apostles of new media wrestled with the implications of his comments, the verb would come to completely overcome the noun, the part about the gun would be forgotten, and the big takeaway would be: Whoa. Did you see what Mao just tweeted?

To me the most important word in that paragraph is 'whoa'.

'Whoa' is not a word you put in the mouths of others if you want to portray them as serious and independent-minded thinkers. Nor, let's be honest, is 'apostles'. Speaking of long words, rhetoricians (there's one) used to call this tactic 'prosopopoeia' (there's a second) - speaking in the voice of another, to make a point of your own. Gladwell's point here, I think, is that social media users are essentially shallow and vacuous.

Pausing briefly to ponder the vacuities that print is able to bestow upon the world every day without anyone blaming the medium (yes, McLuhan, yes), I'd say this derisory 'whoa' speaks volumes about the ongoing vitality of the old debate about 'high' vs 'low' culture.

Only a few days ago, the Observer's Culture section went all 'O tempora! O mores!' about the shift in power from professional to amateur critics. Unsurprisingly, social media was roped in for a whacking:

The real threat to cultural authority turns out not to be blogging but social networking... The point isn't that the traditional critics are always wrong and these populists are right, or even that these comments are overwhelmingly negative or invariably take on the critical consensus. More often than not, they aren't and they don't. The point is that authority has migrated from critics to ordinary folks, and there is nothing - not collusion or singleness of purpose or torrents of publicity - that the traditional critics can do about it. They have seen their monopoly usurped by what amounts to a vast technological word-of-mouth of hundreds of millions of people.

Those familiar with the history of debates on the function of criticism, particularly around the rise of English studies in the late 19th/early 20th century, might feel the icy breath of T.S. Eliot ('breathing the eternal message of vanity, fear, and lust', maybe) and others at this point. If you're not familiar, I recommend Chris Baldick's The Social Mission of English Criticism, or virtually anything by Stefan Collini. This is not a new story - mass participation is the death-knell of criticism, as surely as home taping kills music.

There is no doubting which side of the high/mass culture division the internet falls on. What's doubtful is the continued insistence that mass culture has to be low culture. (If you don't think you're prone to this, go on, ask yourself: which is really better, Classics or media studies? If you have an answer off the top of your head, either way, you're prone.) Look all over the web and you will find thoughtful, sustained engagements with the future of media and culture that make many 'function of criticism' arguments look paltry and narrow. For all the dumb, there is plenty of smart - and, whereas much high-cultural aesthetics invites you to pay no attention to the man behind the curtain, the best critical theories of mass culture ask you to understand the smart in the context of the dumb. Not just with the aim of appreciating the smart (as in many aesthetic formalisms), but with the aim of understanding various cultural systems as wholes (or interlocking sets, or whatever - pick your metaphor), and perhaps even of engaging and (dare I say) improving, of helping the smart outnumber the dumb.

It's enough to make you say 'whoa'. And, thanks to various miracles of technology and economy, it's in all of our hands (quite literally; or pockets). Which means it may be time to take criticism seriously as a discipline. Again.

# Alex Steer (03/02/2011)


Media and revolutions: Microscopes and megaphones

333 words | ~2 min

The current uprising in Egypt has prompted a lot of talk about the role of social media in political protest. The most sensible comments have been to the effect that, no, it's not an enabler, but yes, it can be an accelerant.

What about the relationship between 'social' and 'mainstream' media? Here are my thoughts. Take them to bits and see if they work.

At its best, social media provides a microscope. It plays the same role foreign correspondents traditionally play, giving a close-up view of what's happening. Social media provides many correspondents and allows for the capture and rapid sharing of lots of detail.

At its best, mainstream media provides a megaphone. It takes the important stories and gives them scale.

At their best, together, the microscope feeds the megaphone. Take Egypt. Without the microscope, today's news headline would be 'Rival factions battle in the streets of Cairo'. The microscope shows that the pro-Mubarak faction is a rent-a-mob of thugs and cronies. As a result, the megaphone tells a better story. Or should, at least.

Between the two there has to be a filter. The microscope is notoriously bad when it pretends to be a megaphone. Easy me-too functions like retweeting are great for spreading news, but also great for giving isolated observations an undue sense of scale (which is, after all, what microscopes are for). The megaphone is just as bad at being a microscope, given the pressures of time and money that hinder true investigative journalism.

The filter is the journalist - at least, what journalists may be becoming. Not the sole unearther of facts and pursuer of leads, but the skillful observer of everything the microscope reveals; the maker of connections; the person who finds the story, and knows how to set it up for the megaphone. If professional news has a future it may be through this analytical function. Rehashing press releases is, after all, little more than retweeting.

# Alex Steer (02/02/2011)


Getting economic history wrong with Google Ngrams

267 words | ~1 min

Listen very carefully, I shall say this only once (or more). Google Ngrams is lots of fun, but it is next to useless for analysing the history of language, or of ideas.

I'm talking to you, New York Times Economics Blog. An otherwise sound blog post on the flawed idea of an economic 'new normal' includes this chart, which is used to argue that the idea of the 'new normal' was also bouncing around in the past 'during some major economic shocks'.

Google Ngram for 'new normal' since 1900

Right, so what's happened here? Clearly a forty-year period does not constitute a period of economic shock, and everything other than that massive hill of data does not constitute a meaningful change in usage. If you bother to go to Google Books and search for 'new normal' between 1900 and 1940, all becomes clear. You wade through pages and pages of hits describing 'new normal schools/colleges'. These are, for the most part, references to the Normal Schools (teacher training schools) that sprang up across the States in the late nineteenth and early twentieth centuries. Sorry to be a churl, but seriously. If you want to be an economic historian, do your homework properly.
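
That homework is scriptable, too. Here's a minimal sketch in Python of the disambiguation check: given snippets of Google Books hits (the snippet strings below are invented for illustration, not real data), count how many 'new normal' matches are really the teacher-training sense.

```python
# Hedged sketch: how much of the 'new normal' spike is the 'normal
# school/college' sense? The snippets below are invented examples.
import re

snippets = [
    "the new normal school at Emporia opened its doors",
    "funds were raised for a new normal college in the county",
    "prices settled at a new normal after the war",
]

NEW_NORMAL = re.compile(r"\bnew normal\b", re.IGNORECASE)
SCHOOL_SENSE = re.compile(r"\bnew normal (school|college)s?\b", re.IGNORECASE)

total = sum(bool(NEW_NORMAL.search(s)) for s in snippets)
school = sum(bool(SCHOOL_SENSE.search(s)) for s in snippets)

print(f"{school} of {total} hits are the 'normal school/college' sense")
# -> 2 of 3: most of the early-century 'spike' is schools, not economics.
```

Run it over real search-result snippets rather than these invented ones, and the ratio tells you how much of the Ngrams curve the economic sense actually owns.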

# Alex Steer (14/01/2011)


Predicting the new annoying for 2011

330 words | ~2 min

Remember a little over a year ago, when the mutterings about location-based technology were starting to move into the mainstream? Well, it's happening again, this time with near-field communications.

Looks like NFC is all set to become this year's Thing That Marketers Talk About And Try To Do Until It Gets Annoying.

Print this list off, put it in a drawer, and take it out at the start of 2012.

Did you do that? Good. Now, how many of the following have you seen since the start of 2011?

  1. Shops that broadcast deals to your phone as you walk past
  2. Cereal boxes that launch a URL when you wave your smartphone near them
  3. Blog posts telling you that NFC is a great way to tell 'engaging brand stories' to consumers
  4. Secret parties/gigs run by sports or music brands where you need an NFC enabled phone and a special download to get through the door
  5. NFC-powered group buying/price manipulation gimmicks - turn up, swipe, drive down the price

Right, back to the present.

People in advertising and marketing: please, get a grip before this gets crazy. It's just a swipy phone thing.

If you start off by thinking 'Hmm, how can I use this new technology to push advertising messages?', you will basically be doing it wrong for several reasons.

Take the fact that this new thing exists. Put it in a pocket in the back of your brain. If ever you find yourself with a problem that it could help solve, take it out and dust it down. Make sure it's strategically useful and creatively interesting. If by the time you bring it out it feels like it's been done to death (which, if you're doing this any time after about April 2011, it will), well, think of something else.

For now, though, I'm going to spend the next few weeks waving my phone at walls until something happens.

# Alex Steer (05/01/2011)


Taking TV seriously

1220 words | ~6 min

The Christmas/New Year period inevitably means three things. One, people blog less. Two, the sales start. Three, people start predicting trends.

So I'm going to kick-start this blog again after its holiday slump with a longish post talking about the end-of-year sales and the new-year trends; and in particular about a trend that seems to have a slightly lower profile than I think it deserves.

Believe it or not, the trend is that TV is becoming more important. And it's only really worth talking about it because so many of us have got sidetracked by computers and mobile over the last few years.

Ever since Black Friday (in the US), this year's winter sales story has been about the growth in online retail. This year online spending rose 15% to $36.4bn over the holiday season in the US, well above the overall projected spending rise (3.3%); and the top online retailers saw 42% more traffic than last year. The growth areas for online sales were computer hardware (23%), books and magazines (22%), consumer electronics (21%) and software (20%).

TV feels like the forgotten screen in the consumer electronics category. Read through JWT's excellent 100 Things to Watch for 2011 and you'll notice how much of it is about computing, and particularly mobile computing. There is one mention of TV, in a discussion on video calling. With TV barely featuring in such a comprehensive collection of forecasts, you won't be surprised to know that it's also absent in lots of smaller, more off-the-cuff predictions that do the rounds at this time of year.

Which is odd, as this is set to be a very good year for the big glowing box in the living room. It looks like 2010 saw 17% year-on-year growth in TV sales worldwide. Most of the growth is being driven by emerging markets, especially China, and by new technology going mainstream fast in mature markets. Shipments of LCD TVs grew 31% in 2010, but that adoption curve is already flattening, with 13% growth forecast for 2011.

The new thing for the new year will be internet TV, projected to account for a fifth of new TV shipments. Since this has the word 'internet' in it, there's predictably been lots of chatter about it, what with Google TV, Apple TV and what looks like a contender from Microsoft about to be shown off at CES in a couple of days' time.

Globally, though, the growth will be fuelled by the newly affordable, not just the new. There's a huge emerging market for TV, behind the US/UK technology curve but with a lot of collective spending power. In South Africa, for example, TV ownership by household rose from 53.8% to 65.6% between 2001 and 2007, and is catching up with radio ownership (76.6% in 2007). Of the 3.2 million South African subscribers to the DSTV pay-TV service, over 10% were added in the six months to September 2010, when DSTV introduced its lower-cost Compact package to attract new customers. There are lots of countries where TV has a lot more growth potential than in SA (I just don't have that data - sorry).

So TV is about to become very big in lots of new markets. In well established markets, TV is already big, but it's quietly becoming more important.

A couple of years ago, TV got a brief flurry of attention as part of that whole raft of stories about recession-proof consumer behaviour. Apparently, in late 2008 and early 2009, we were all spending more time at home to save money, so we were buying big new TVs in anticipation of spending less time going out and more time on the sofa.

That TV story came and went along with allotments, lipstick sales, hemlines, and other barometers of recessionistadom. That's a pity, as I think it masked something bigger. We're starting to pay a lot more attention to TV, and to demand more of it.

3D TV remains a gimmick, and sales have been disappointing. This is usually blamed (as in the link) on the lack of worthwhile 3D content, but my hunch is that 3D TV basically misjudges how we watch TV. Unless we're fanatical, most of the time we give TV continuous partial attention. It's one of the reasons we don't mind ad breaks on TV but hate them on streaming video, and why we'd walk out of the cinema if they interrupted the screening to show commercials. 3D TV assumes we give TV our full attention, so are happy to wear big plastic glasses while watching. We don't, so we aren't. We get up, walk round, chat to each other, make tea, play on our phones, and so on.

But 3D TV gets it slightly right even as it gets it wrong. It assumes we now want to give TV sets our full attention. We don't, but we are giving them more than we used to. We don't just want them to be good enough, left on unobtrusively in the background like radios with pictures.

I think TVs are turning into appointment-to-view technology, serving those occasions when we want to sit down and be entertained by one thing for an extended period. Hence the success of HD; of hard-disk recorders; of games consoles like the Wii and the Kinect-equipped Xbox 360, whose natural home is the living room, not the bedroom. 3D glasses are just a step too far in the right direction.

It's lasted too long to be a recessionary fluke. Perhaps it's because we have more, and more portable, devices (smartphones, tablets, laptops) that fulfil our continuous-partial-attention entertainment needs. We now expect TV to fill the niche that computers and mobile devices don't: something to entertain us, quite immersively, for specific periods of time, when we choose.

This obviously has implications for advertising, beyond the much-discussed impacts of recording and the new potential of internet TV to target ads. If we demand more of TV, we may be less forgiving of interruptive, uninteresting advertising. (Or maybe that's just wishful thinking.)

With the exception of 3D, which I reckon will soon be relegated to gaming and cinema (fully immersive screen entertainment), I'd bet that this will be a good year for the humble telly, and everything that plugs into it.

# Alex Steer (04/01/2011)


Releasing advertising into the wild

888 words | ~4 min

The other day I found myself in Strand Books, an enormous second-hand bookshop in New York. Specifically, on the top floor, which only seems to be accessible by taking the lift; in a tiny narrow passage near the back of the rare books section - quite literally two long ceiling-height bookshelves with a person-sized gap in between, and a sign above that says 'Books about books'.

It's the bibliography section, and I think it's brilliant, so I'll be quite happy if no-one else ever finds it tucked away at the back there. But I can see why people might not share my enthusiasm for books about books, which may seem like the most abstract topic possible.

One of the things I've been thinking about, and occasionally talking about, over the last couple of weeks is the idea that a piece of communication shouldn't be judged on how much it resembles another piece of communication.

It sounds obvious, but think how little advertising really resembles anything else in popular culture. A linguist would say (and linguists have) that advertising has a remarkably consistent set of discourse features: a bunch of things that mark advertising language out as advertising language. You probably know what I'm talking about. The same is true of visual language. It's the reason all food commercials have a pack shot at the end, and all drinks ads feature someone drinking the product (just in case you've forgotten what you do with a bottle of beer, like Ted Striker from Airplane!), to say nothing of acting staples like 'instant-relief-from-headache smile' and 'what-will-we-do-with-this-problem-debt? frown'.

Like my 'books about books', these are ads about ads. All their conventions come from advertising. It's a small, incestuous gene pool of features. And a lot of communications make sense because we're trained to recognize what's in them from previous communications.

When you compare advertising to other kinds of cultural production, you realise how different it's become, like the wildlife of Madagascar as it evolved in isolation, or the rednecks from Deliverance.

Marketing messages used to have it easy because they occupied their own space where they were surrounded by their own kind: ad breaks, billboards, predictable bits of newspaper and magazine pages. Now they don't.

Marketing messages now compete, more and more, with other kinds of popular cultural production. They live alongside all sorts of other creative and informative content. Take YouTube channels for example: some are from brands, some are from broadcasters, some are from people, with little to separate them. There are no ad breaks on the internet. In what we call ambient media (stuff you can see in the street, trip over, bump into or be given) the same kind of thing is happening, mainly because of the falling cost and complexity of production. TV and radio are a different story, still dominated by the logic of booking and paying for space, though internet/TV systems may challenge that, as does the growth of the web as a vector for entertainment media - streaming video and music services, that sort of thing - that are eroding TV and radio's share of eyeballs and earlobes.

It's not a breakthrough of analysis to say that a lot of advertising is aggressive and bullying and insistent and shouty, and has lots of other similar bad personality traits. But it's true, and it's amazing that it was ever allowed to get like that.

The fact that, here in the US, Congress has felt the need to ban advertising from being louder than programming on TV shows that a lot of advertising is literally shouty. It's a law that shouldn't have needed to exist. But there it is, on the books, because advertising felt like it had a license to push itself increasingly aggressively on its audiences. That's what happens when you stop thinking about culture as a whole and start making ads about ads, entering into an arms race with other advertisers to see who can make the most ad-like ad: the biggest logo, the shoutiest voiceover, the smallest small print, the longest appearance of the phone number on the screen. That's when you end up with the Cash4Gold ads.

As the idea of 'ad space' falls apart, the space ads occupy is popular culture, and it's crowded there. Ads aren't just compared to other ads, they're compared to every kind of popular cultural production, and the benchmarks are things like interestingness, elegance, utility and entertainment, not the old metrics of consistency, uniqueness or reinforcement. The Old Spice campaign was great because it was funnier than most other things on YouTube; Nike Plus is great because it works better than lots of other run-tracking systems.

Great communications have always been interesting rather than shouty, so this is not new news. It's just that the stakes are higher than they've been for a while, because advertising is less protected as a practice.

It's been released into the wild, where like other elements of popular culture it will find itself having to do useful and interesting things to survive, rather than stand there talking loudly about itself.

# Alex Steer (13/12/2010)


Squeezing out originality on Facebook

369 words | ~2 min

Today Facebook launched a new version of the profile. The 'info' page is concerning.

Here's the kind of thing that's at the top:

Facebook profile screenshot

And a little further down:

Facebook profile screenshot

As you can see, I don't have a lot on my profile. But assuming I want to add a TV programme, I'm invited to pick from a pre-selected autocomplete list, like this:

Facebook profile screenshot

In short, there's almost no room left for anything other than structured data. Back in the late middle ages, when I joined Facebook, putting daft things as your interests became the cause of a kind of strange one-upmanship, as did playing around with the few restrictions imposed by Facebook (like the fact that status updates used to have to start with 'is', which spawned a generation of updates of the kind 'John Smith is the fridge has broken again').

Now we're invited, all of us, to exercise as little lateral thinking as possible, and to make our responses fit into a series of easily-analysed fields. That's because it's easier to target ads when you limit the options to a set of things a computational parser can understand. Computational parsers are very bad at irony and wordplay. Perversely, Facebook's users (whose centre of gravity is still among twenty- and thirtysomethings) are very good at both.
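
To make that contrast concrete, here's a toy sketch in Python (invented data and names, nothing to do with Facebook's actual systems) of why structured fields are so much easier to sell against than free text:

```python
# Toy contrast, invented data: structured profile fields resolve to ad
# segments by simple lookup; ironic free text defeats naive parsing.
ad_segments = {"The Wire": "hbo_drama_fans", "Doctor Who": "scifi_fans"}

structured_profile = {"tv": ["The Wire"]}  # picked from an autocomplete list
free_text_status = "John Smith is the fridge has broken again"

# Structured data: exact lookup, no interpretation needed.
for show in structured_profile["tv"]:
    print(ad_segments.get(show))  # -> hbo_drama_fans

# Free text: a naive parser finds nothing it can sell against, and has
# no idea the sentence is a joke about the old mandatory 'is'.
print([k for k in ad_segments if k in free_text_status])  # -> []
```

The lookup version is also the only one that scales to half a billion profiles, which is rather the point.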

The other thing affluent, tech-savvy twenty- and thirtysomethings are very good at is punishing brands they see putting business considerations in the way of good user experience and service.

At the end of a year of grumblings about privacy settings and control over personal data, this is a small but bad sign. Most of us will forgive these issues in return for a really good experience. But if people come to feel that the ability to use and customize Facebook has been compromised by the site's willingness to package and sell their data to advertisers, they may turn away unexpectedly fast.

# Alex Steer (06/12/2010)


Permanent brokenness: stray thoughts on wikis and leaks

572 words | ~3 min

Quick one this, late at night.

Fast Company Design has a nice overview of some of the best representations of the leaked US embassy cables from Wikileaks.

Each of these is a neat summary of why the internet changes culture.

These tools are made possible because digitally stored text is greppable. You can search it, rearrange it, perform queries on it and get the results back. Various scripting languages and tools provide a huge range of ways to muck about with text in a way that only used to be possible if you had a pair of scissors and didn't mind being banned from your local library.

In a greppable world, nothing that is made public can be buried in detail. Form and arrangement of texts can be remade on the fly. Tasks like concordancing, which used to take years, can be done in seconds using a script. And, of course, huge stacks of documents can be skimmed for salient details just as fast.

In other words: search doesn't just let you find things in the world. It lets you remake them.
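
To see how cheap this has become, here's a minimal keyword-in-context (KWIC) concordancer in Python - a sketch, with a toy input standing in for whatever corpus you point it at:

```python
# Minimal keyword-in-context (KWIC) concordance: print every occurrence
# of a word with a fixed window of surrounding words for easy scanning.
import re

def concordance(text, keyword, window=4):
    words = re.findall(r"\w+", text.lower())
    for i, word in enumerate(words):
        if word == keyword:
            left = " ".join(words[max(0, i - window):i])
            right = " ".join(words[i + 1:i + 1 + window])
            print(f"{left:>30} [{word}] {right}")

# Toy input; swap in a novel, a blog archive or a stack of cables.
concordance("The cable notes that the embassy read the cable twice.", "cable")
```

A task that occupied scholars for years per text now runs in the time it takes to read this sentence.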

If you know the first thing about computers, what I'm writing is the most obvious thing in the world; and I've talked about it here before, in the context of illusions of online privacy; and I'm not a computer scientist, so I'm hardly well-placed to talk.

If you work in marketing or media, you can talk all you like about how the internet/web/social media/Facebook/Foursquare/whatever changes everything. But if you don't understand this, you don't really understand why the web matters, and you won't be able to roll with its punches.

The web matters to culture because it's the mainstreaming of a large set of information-mucking-about-with tools and techniques that used to be the preserve of programmers, linguists and various kinds of mathematicians whose jobs I don't understand. Those tools, and the things we make with them, have become such a part of our lives that they have brought with them new expectations about culture. Expectations like the idea that we should be able to take things, mash them up, patch them, fix them, make them better and pass them on when they don't do exactly the job we want them to do.

This doesn't mean participation or user-generated content or even customization tweaks. It means the idea that everything is basically permanently broken - that problems need to be solved (and can be solved) on a case-by-case basis, quickly. If you make ads, that's a challenge (and if you make products or services, even more so). Mainly, it feels like it's a matter of approach. Too much fanfare and ta-daa, and you'll look pretty bad when your big idea, big campaign, big product (etc.) is met by a kind of grumpy it'll-do-for-now-ness.

How do you get attention and make yourself useful in a hacking, grepping popular culture?

Can 'release early, release often' work for ads or brands? Can you find room for a big idea in culture by executing it in a series of permanent-beta ways? Can that thinking work in media other than digital without seeming like twee engagement-i-ness? Would it be appropriate to end this post with a bunch of not-quite-answered questions? Or is that too meta?

# Alex Steer (30/11/2010)


Long, thin advertising

722 words | ~4 min

I realize I don't talk about ads a lot here. And I'm sort of not doing it now, because this campaign is a deliberate piece of scam work for the Swedish energy company Fortum, which won a design student a Future Lion. It's a sort of advertising what-if.

Two-sentence version: Fortum doesn't want to be a boring energy company, so it opens a gym. The exercise machines have generators in them, so users can export power to the grid, and track and share their progress (and others') via various social media gubbins.

You'll notice it seems to be illegal now to make an ad campaign without making a video about how you made the ad campaign, and ideally staying on the right side of the law involves lots of swooshing social network logos overscored by chat about how you executed your creative idea across four hundred different channels, all while riding a horse and cooking eggs. But actually, that's not what's really interesting about this charmingly left-field idea.

It's interesting because of scale.

Last year all the big Cannes Lion winners were activations, some of which were turned into ads. Everyone's take-out was that these campaigns were so striking because of the smallness in space and time of the original activations. Tropicana's 'Sun' originally reached one small village in the Arctic Circle; Playground's 'Sleepless' was done in one hiking store in Stockholm; Bosch 'Stone Age Meat' was one supermarket on a handful of days. They were all driven above the line to become huge campaigns that lots of people saw, inviting the bigger and more important secondary audience to share in the joy of the moment remotely. But they started as very local, and very time-limited, blink-and-you-miss-it things in the real world.

The Fortum idea isn't for an activation, it's for a business. Small and local in space, but (hopefully) lasting in time. In that sense I don't think of it as an activation, or even a campaign in any meaningful sense. It's a permanent contribution. The difference between doing an activation and running a gym is qualitative.

Ads are short and fat. They're everywhere for a short period of time, pushing a message repeatedly. The things-that-aren't-ads that I'm talking about are long and thin. They're not huge noisy things but they do last, and they're useful, and they're branded. If they lived behind a screen, I'd call them apps, I suppose.

If you like that sort of thing, it's a bit of a challenge to campaign planning. It has to become much more a part of business planning. Running a gym is a serious longitudinal commitment, and a risk. If you stop bothering once you're bored with it, it'll fall apart and people will dislike you for it. A couple of years ago a lot of campaign Twitter feeds fell victim to that kind of campaign-level short-term thinking: a few weeks of feverish updating, then a long sad silence. Long, thin advertising means planning for the long term.

It also upsets the relationship between brands and products/services. Put over-simply, brand positionings are often abstractions of product/service. Nike is all about improving sports performance because it makes sports equipment. IBM is about making the planet smarter because it makes industrial computing systems. BP... never mind.

But in a long/thin world where you make apps not ads, the things you build and run are your communications, and they have no campaign timeline. Naomi Klein was wrong ten years ago when she suggested in No Logo that brands were becoming detached from products and services. The attachment's still there; it just inverts. You make stuff based on who you are, rather than deciding who you are based on the things you happen to make - which seems a far more sensible arrangement to me. On that basis, brand planning belongs in innovation and business development as much as in marketing and communications.

Brand planning often begins with ideas of the 'we're a frozen food company and we want to talk about healthy eating' variety. How much more interesting things would get if you began, 'We're a bunch of health freaks. What shall we do to help other people be healthy? By the way, we own a lot of freezers.'

# Alex Steer (18/11/2010)


A wave of the hand

696 words | ~3 min

Watching Mark Zuckerberg announce the changes to Facebook messages, I noticed this, as TechCrunch reports it:

Zuckerberg recalled talking to high schoolers recently and asking them how they communicate with one another. They don't like email. "It's too formal."

And here's the edit from the Telegraph:

Email is too slow... email is too formal. There is too much friction, like the filling in the subject line... when people send an email.

I've heard the whole email-is-dead thing before, but the reason for the distaste for email was made unusually clear here. (Perhaps too clear - it came up so many times it began to feel forced, like it wasn't quite justification enough for the shiny new post-email messaging system, but anyway).

The email subject line was made to sound like nineteenth-century penmanship or table manners, like case agreement in Greek, like the structural devices that make a sonnet a sonnet. A friction on the free movement of thoughts and ideas from person to person.

I get the same sense of an idea when I think about Kinect, whose point of differentiation seems to be that the Wii is great, but having to have a controller in your hand is basically like having to build the console yourself. Or when I think about contactless payment technology, especially when it's embedded in phones, like payWave; or what Google Voice Search will no doubt become.

It's the idea that the best tools are the ones that provide immediacy, in the broad sense of putting nothing between you and your object; that provide agency without the bother of mechanism. They promise a world without effort, without delay and without complexity; everything available and amenable immediately. A wave of the hand, and thought becomes action. In other words, magic.

But magic requires surprise. Be honest, does paying for something with your phone, or sending an instant message, or searching by speaking, feel like it's magic? Novel maybe, but not magic. When everyone's got magic powers, being a magician is, I imagine, rubbish.

All of which will push more and more magical innovation that becomes unmagical almost immediately. Apple even used the word 'magical' to describe the iPad when it launched. It sounded stupid at the time, and more so now that you see them on trains and buses. Yesterday, Mary Meeker gave a presentation at the Web 2.0 Summit in which she marvelled at the innovation rate of massive companies like Apple and Google. But in the humdrum magic-making industry that kind of innovation rate is required to survive.

Meanwhile, with ten iPads on every bus, buses become more magical than iPads. Rare, extraordinary, unfathomably complex to most of us, things-that-obviously-have-inner-workings are oddly fascinating because they buck the trend of our expectations. How else to explain the rise and resurgence of Etsy, allotment gardening, Newspaper Club, home electronics, Shop Class as Soulcraft, the Google Chrome TV ads, and that odd BBC show in which Giles Coren and Sue Perkins re-enact The Good Life?

The novelty value of magical tools may not go away any time soon, and the innovation rate may not slacken, but it feels like the capacity to genuinely fascinate - in the way that HTML used to fascinate geeky kids fifteen years ago, or DIY computers fifteen years before that - is dropping. Magic's expected, so now mechanical is the new magical.

So if you're thinking of knocking out a smartphone app as part of your digital marketing push, wonder if it won't just get lost in the tide of magic. Even when you use a smartphone app, you're basically just watching a magic show. See if you can make something that people will want to muck around with instead.

# Alex Steer (17/11/2010)