Alex Steer

Better communication through data / about / archive

Predicting the new annoying for 2011

330 words | ~2 min

Remember a little over a year ago, when the mutterings about location-based technology were starting to move into the mainstream? Well, it's happening again, this time with near-field communications.

Looks like NFC is all set to become this year's Thing That Marketers Talk About And Try To Do Until It Gets Annoying.

Print this list off, put it in a drawer, and take it out at the start of 2012.

Did you do that? Good. Now, how many of the following have you seen since the start of 2011?

  1. Shops that broadcast deals to your phone as you walk past
  2. Cereal boxes that launch a URL when you wave your smartphone near them
  3. Blog posts telling you that NFC is a great way to tell 'engaging brand stories' to consumers
  4. Secret parties/gigs run by sports or music brands where you need an NFC-enabled phone and a special download to get through the door
  5. NFC-powered group buying/price manipulation gimmicks - turn up, swipe, drive down the price

Right, back to the present.

People in advertising and marketing: please, get a grip before this gets crazy. It's just a swipy phone thing.

If you start off by thinking 'Hmm, how can I use this new technology to push advertising messages?', you will basically be doing it wrong for several reasons.

Take the fact that this new thing exists. Put it in a pocket in the back of your brain. If ever you find yourself with a problem that it could help solve, take it out and dust it down. Make sure it's strategically useful and creatively interesting. If by the time you bring it out it feels like it's been done to death (which, if you're doing this any time after about April 2011, it will), well, think of something else.

For now, though, I'm going to spend the next few weeks waving my phone at walls until something happens.

# Alex Steer (05/01/2011)


Taking TV seriously

1220 words | ~6 min

The Christmas/New Year period inevitably means three things. One, people blog less. Two, the sales start. Three, people start predicting trends.

So I'm going to kick-start this blog again after its holiday slump with a longish post talking about the end-of-year sales and the new-year trends; and in particular about a trend that seems to have a slightly lower profile than I think it deserves.

Believe it or not, the trend is that TV is becoming more important. And it's only really worth talking about because so many of us have got sidetracked by computers and mobile over the last few years.

Ever since Black Friday (in the US), this year's winter sales story has been about the growth in online retail. This year online spending rose 15% to $36.4bn over the holiday season in the US, well above the overall projected spending rise (3.3%); and the top online retailers saw 42% more traffic than last year. The growth areas for online sales were computer hardware (23%), books and magazines (22%), consumer electronics (21%) and software (20%).

TV feels like the forgotten screen in the consumer electronics category. Read through JWT's excellent 100 Things to Watch for 2011 and you'll notice how much of it is about computing, and particularly mobile computing. There is one mention of TV, in a discussion on video calling. With TV barely featuring in such a comprehensive collection of forecasts, you won't be surprised to know that it's also absent in lots of smaller, more off-the-cuff predictions that do the rounds at this time of year.

Which is odd, as this is set to be a very good year for the big glowing box in the living room. It looks like 2010 saw 17% year-on-year growth in TV sales worldwide. Most of the growth is being driven by emerging markets, especially China, and by new technology going mainstream fast in mature markets. Shipments of LCD TVs grew 31% in 2010, but that adoption curve is already flattening, with 13% growth forecast for 2011.

The new thing for the new year will be internet TV, projected to account for a fifth of new TV shipments. Since this has the word 'internet' in it, there's predictably been lots of chatter about it, what with Google TV, Apple TV and what looks like a contender from Microsoft about to be shown off at CES in a couple of days' time.

Globally, though, the growth will be fuelled by the newly affordable, not just the new. There's a huge emerging market for TV, behind the US/UK technology curve but with a lot of collective spending power. In South Africa, for example, TV ownership by household rose from 53.8% to 65.6% between 2001 and 2007, and is catching up with radio ownership (76.6% in 2007). Of the 3.2 million South African subscribers to the DSTV pay-TV service, over 10% were added in the six months to September 2010, when DSTV introduced its lower-cost Compact package to attract new customers. There are lots of countries where TV has a lot more growth potential than in SA (I just don't have that data - sorry).

So TV is about to become very big in lots of new markets. In well established markets, TV is already big, but it's quietly becoming more important.

A couple of years ago, TV got a brief flurry of attention as part of that whole raft of stories about recession-proof consumer behaviour. Apparently, in late 2008 and early 2009, we were all spending more time at home to save money, so we were buying big new TVs in anticipation of spending less time going out and more time on the sofa.

That TV story came and went along with allotments, lipstick sales, hemlines, and other barometers of recessionistadom. That's a pity, as I think it masked something bigger. We're starting to pay a lot more attention to TV, and to demand more of it.

3D TV remains a gimmick, and sales have been disappointing. This is usually blamed (as in the link) on the lack of worthwhile 3D content, but my hunch is that 3D TV basically misjudges how we watch TV. Unless we're fanatical, most of the time we give TV continuous partial attention. It's one of the reasons we don't mind ad breaks on TV but hate them on streaming video, and why we'd walk out of the cinema if they interrupted the screening to show commercials. 3D TV assumes we give TV our full attention, so are happy to wear big plastic glasses while watching. We don't, so we aren't. We get up, walk round, chat to each other, make tea, play on our phones, and so on.

But 3D TV gets it slightly right even as it gets it wrong. It assumes we now want to give TV sets our full attention. We don't, but we are giving them more than we used to. We no longer want them just to be good enough, sitting unobtrusively in the background like radios with pictures.

I think TVs are turning into appointment-to-view technology, serving those occasions when we want to sit down and be entertained by one thing for an extended period. Hence the success of HD; of hard-disk recorders; of games hardware like the Wii, the Xbox 360 and Kinect, whose natural home is the living room, not the bedroom. 3D glasses are just a step too far in the right direction.

It's lasted too long to be a recessionary fluke. Perhaps it's because we have more, more portable devices (smartphones, tablets, laptops) that fulfil our continuous-partial-attention entertainment needs. We now expect TV to fill the niche that computers and mobile devices don't: something to entertain us, quite immersively, for specific periods of time, when we choose.

This obviously has implications for advertising, beyond the much-discussed impacts of recording and the new potential of internet TV to target ads. If we demand more of TV, we may be less forgiving of interruptive, uninteresting advertising. (Or maybe that's just wishful thinking.)

With the exception of 3D, which I reckon will soon be relegated to gaming and cinema (fully immersive screen entertainment), I'd bet that this will be a good year for the humble telly, and everything that plugs into it.

# Alex Steer (04/01/2011)


Releasing advertising into the wild

888 words | ~4 min

The other day I found myself in Strand Books, an enormous second-hand bookshop in New York. Specifically, on the top floor, which only seems to be accessible by taking the lift; in a tiny narrow passage near the back of the rare books section - quite literally two long ceiling-height bookshelves with a person-sized gap in between, and a sign above that says Books about books.

It's the bibliography section, and I think it's brilliant, so I'll be quite happy if no-one else ever finds it tucked away at the back there. But I can see why people might not share my enthusiasm for books about books, which may seem like the most abstract topic possible.

One of the things I've been thinking about, and occasionally talking about, over the last couple of weeks is the idea that a piece of communication shouldn't be judged on how much it resembles another piece of communication.

It sounds obvious, but think how little advertising really resembles anything else in popular culture. A linguist would say (and linguists have) that advertising has a remarkably consistent set of discourse features: a bunch of things that mark advertising language out as advertising language. You probably know what I'm talking about. The same is true of visual language. It's the reason all food commercials have a pack shot at the end, and all drinks ads feature someone drinking the product (just in case you've forgotten what you do with a bottle of beer, like Ted Striker from Airplane!), to say nothing of acting staples like 'instant-relief-from-headache smile' and 'what-will-we-do-with-this-problem-debt? frown'.

Like my 'books about books', these are ads about ads. All their conventions come from advertising. It's a small, incestuous gene pool of features. And a lot of communications make sense because we're trained to recognize what's in them from previous communications.

When you compare advertising to other kinds of cultural production, you realise how different it's become, like the wildlife of Madagascar as it evolved in isolation, or the rednecks from Deliverance.

Marketing messages used to have it easy because they occupied their own space where they were surrounded by their own kind: ad breaks, billboards, predictable bits of newspaper and magazine pages. Now they don't.

Marketing messages now compete, more and more, with other kinds of popular cultural production. They live alongside all sorts of other creative and informative content. Take YouTube channels for example: some are from brands, some are from broadcasters, some are from people, with little to separate them. There are no ad breaks on the internet. In what we call ambient media (stuff you can see in the street, trip over, bump into or be given) the same kind of thing is happening, mainly because of the falling cost and complexity of production. TV and radio are a different story, still dominated by the logic of booking and paying for space, though internet/TV systems may challenge that, as does the growth of the web as a vector for entertainment media - streaming video and music services, that sort of thing - that are eroding TV and radio's share of eyeballs and earlobes.

It's not a breakthrough of analysis to say that a lot of advertising is aggressive and bullying and insistent and shouty, and has lots of other similar bad personality traits. But it's true, and it's amazing that it was ever allowed to get like that.

The fact that, here in the US, Congress has felt the need to ban TV advertising from being louder than the programming around it shows that a lot of advertising is literally shouty. It's a law that shouldn't have needed to exist. But there it is, on the books, because advertising felt like it had a license to push itself increasingly aggressively on its audiences. That's what happens when you stop thinking about culture as a whole and start making ads about ads, entering into an arms race with other advertisers to see who can make the most ad-like ad: the biggest logo, the shoutiest voiceover, the smallest small print, the longest appearance of the phone number on the screen. That's when you end up with the Cash4Gold ads.

As the idea of 'ad space' falls apart, the space ads occupy is popular culture, and it's crowded there. Ads aren't just compared to other ads, they're compared to every kind of popular cultural production, and the benchmarks are things like interestingness, elegance, utility and entertainment, not the old metrics of consistency, uniqueness or reinforcement. The Old Spice campaign was great because it was funnier than most other things on YouTube; Nike Plus is great because it works better than lots of other run-tracking systems.

Great communications have always been interesting rather than shouty, so this is not new news. It's just that the stakes are higher than they've been for a while, because advertising is less protected as a practice.

It's been released into the wild, where like other elements of popular culture it will find itself having to do useful and interesting things to survive, rather than stand there talking loudly about itself.

# Alex Steer (13/12/2010)


Squeezing out originality on Facebook

369 words | ~2 min

Today Facebook launched a new version of the profile. The 'info' page is concerning.

Here's the kind of thing that's at the top:

[Facebook profile screenshot]

And a little further down:

[Facebook profile screenshot]

As you can see, I don't have a lot on my profile. But assuming I want to add a TV programme, I'm invited to pick from a pre-selected autocomplete list, like this:

[Facebook profile screenshot]

In short, there's almost no room left for anything other than structured data. Back in the late middle ages, when I joined Facebook, putting daft things as your interests became the cause of a kind of strange one-upmanship, as did playing around with the few restrictions imposed by Facebook (like the fact that status updates used to have to start with 'is', which spawned a generation of updates of the kind 'John Smith is the fridge has broken again').

Now we're invited, all of us, to exercise as little lateral thinking as possible, and to make our responses fit into a series of easily-analysed fields. That's because it's easier to target ads when you limit the options to a set of things a computational parser can understand. Computational parsers are very bad at irony and wordplay. Perversely, Facebook's users (whose centre of gravity is still among twenty- and thirtysomethings) are very good at both.

The other thing affluent, tech-savvy twenty- and thirtysomethings are very good at is punishing brands when they think those brands are putting business considerations in the way of good user experience and service.

At the end of a year of grumblings about privacy settings and control over personal data, this is a small but bad sign. Most of us will forgive these issues in return for a really good experience. But if people come to feel that the ability to use and customize Facebook has been compromised by the site's willingness to package and sell their data to advertisers, they may turn away unexpectedly fast.

# Alex Steer (06/12/2010)


Permanent brokenness: stray thoughts on wikis and leaks

572 words | ~3 min

Quick one this, late at night.

Fast Company Design has a nice overview of some of the best representations of the leaked US embassy cables from Wikileaks.

Each of these is a neat summary of why the internet changes culture.

These tools are made possible because digitally stored text is greppable. You can search it, rearrange it, perform queries on it and get the results back. Various scripting languages and tools provide a huge range of ways to muck about with text in a way that only used to be possible if you had a pair of scissors and didn't mind being banned from your local library.

In a greppable world, nothing that is made public can be buried in detail. Form and arrangement of texts can be remade on the fly. Tasks like concordancing, which used to take years, can be done in seconds using a script. And, of course, huge stacks of documents can be skimmed for salient details just as fast.

In other words: search doesn't just let you find things in the world. It lets you remake them.
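To make that a little more concrete: a keyword-in-context concordance, the sort of thing that once took scholars decades by hand, is now a few lines of script. Here's a minimal sketch in Python (the file name and search term are placeholders, nothing to do with the cables themselves):

```python
import re
import sys

def concordance(text, term, width=40):
    """Print every occurrence of `term` with `width` characters of context on each side."""
    for match in re.finditer(re.escape(term), text, re.IGNORECASE):
        start = max(match.start() - width, 0)
        end = min(match.end() + width, len(text))
        print(" ".join(text[start:end].split()))  # collapse newlines and runs of spaces

if __name__ == "__main__":
    # Usage (placeholders): python concordance.py somefile.txt "some term"
    path, term = sys.argv[1], sys.argv[2]
    with open(path, encoding="utf-8") as f:
        concordance(f.read(), term)
```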

If you know the first thing about computers, what I'm writing is the most obvious thing in the world; and I've talked about it here before, in the context of illusions of online privacy; and I'm not a computer scientist, so I'm hardly well-placed to talk.

If you work in marketing or media, you can talk all you like about how the internet/web/social media/Facebook/Foursquare/whatever changes everything. But if you don't understand this, you don't really understand why the web matters, and you won't be able to roll with its punches.

The web matters to culture because it's the mainstreaming of a large set of information-mucking-about-with tools and techniques that used to be the preserve of programmers, linguists and various kinds of mathematicians whose jobs I don't understand. Those tools, and the things we make with them, have become such a part of our lives that they have brought with them new expectations about culture. Expectations like the idea that we should be able to take things, mash them up, patch them, fix them, make them better and pass them on when they don't do exactly the job we want them to do.

This doesn't mean participation or user-generated content or even customization tweaks. It means the idea that everything is basically permanently broken - that problems need to be solved (and can be solved) on a case-by-case basis, quickly. If you make ads, that's a challenge (and if you make products or services, even more so). Mainly, it feels like it's a matter of approach. Too much fanfare and ta-daa, and you'll look pretty bad when your big idea, big campaign, big product (etc.) is met by a kind of grumpy it'll-do-for-now-ness.

How do you get attention and make yourself useful in a hacking, grepping popular culture?

Can 'release early, release often' work for ads or brands? Can you find room for a big idea in culture by executing it in a series of permanent-beta ways? Can that thinking work in media other than digital without seeming like twee engagement-i-ness? Would it be appropriate to end this post with a bunch of not-quite-answered questions? Or is that too meta?

# Alex Steer (30/11/2010)


Long, thin advertising

722 words | ~4 min

I realize I don't talk about ads a lot here. And I'm sort of not doing it now, because this campaign is a deliberate piece of scam work for Swedish energy company Fortum, which won a Future Lion for a design student. It's a sort of advertising what-if.

Two-sentence version: Fortum doesn't want to be a boring energy company so it opens a gym. The exercise machines have generators in so users can export to the grid, and track and share their progress (and others') via various social media gubbins.

You'll notice it seems to be illegal now to make an ad campaign without making a video about how you made the ad campaign, and ideally staying on the right side of the law involves lots of swooshing social network logos overscored by chat about how you executed your creative idea across four hundred different channels, all while riding a horse and cooking eggs. But actually, that's not what's really interesting about this charmingly left-field idea.

It's interesting because of scale.

Last year all the big Cannes Lion winners were activations, some of which were turned into ads. Everyone's take-out was that these campaigns were so striking because of the smallness in space and time of the original activations. Tropicana's 'Sun' originally reached one small village in the Arctic Circle; Playground's 'Sleepless' was done in one hiking store in Stockholm; Bosch 'Stone Age Meat' was one supermarket on a handful of days. They were all driven above the line to become huge campaigns that lots of people saw, inviting the bigger and more important secondary audience to share in the joy of the moment remotely. But they started as very local, and very time-limited, blink-and-you-miss-it things in the real world.

The Fortum idea isn't for an activation, it's for a business. Small and local in space, but (hopefully) lasting in time. In that sense I don't think of it as an activation, or even a campaign in any meaningful sense. It's a permanent contribution. The difference between doing an activation and running a gym is qualitative.

Ads are short and fat. They're everywhere for a short period of time, pushing a message repeatedly. The things-that-aren't-ads that I'm talking about are long and thin. They're not huge noisy things but they do last, and they're useful, and they're branded. If they lived behind a screen, I'd call them apps, I suppose.

If you like that sort of thing, it's a bit of a challenge to campaign planning. It has to become much more a part of business planning. Running a gym is a serious longitudinal commitment, and a risk. If you stop bothering once you're bored with it, it'll fall apart and people will dislike you for it. A couple of years ago a lot of campaign Twitter feeds fell victim to that kind of campaign-level short-term thinking: a few weeks of feverish updating, then a long sad silence. Long, thin advertising means planning for the long term.

It also upsets the relationship between brands and products/services. Put over-simply, brand positionings are often abstractions of product/service. Nike is all about improving sports performance because it makes sports equipment. IBM is about making the planet smarter because it makes industrial computing systems. BP... never mind.

But in a long/thin world where you make apps not ads, the things you build and run are your communications, and they have no campaign timeline. Naomi Klein was wrong ten years ago when she suggested in No Logo that brands were becoming detached from products and services. The attachment's still there, it just inverts. You make stuff based on who you are, rather than deciding who you are based on the things you happen to make - which seems a far more sensible arrangement to me. On that basis, brand planning belongs in innovation and business development as much as in marketing and communications.

Brand planning often begins with ideas of the 'we're a frozen food company and we want to talk about healthy eating' variety. How much more interesting things would get if you began, 'We're a bunch of health freaks. What shall we do to help other people be healthy? By the way, we own a lot of freezers.'

# Alex Steer (18/11/2010)


A wave of the hand

696 words | ~3 min

Watching Mark Zuckerberg announce the changes to Facebook messages, I noticed this, as TechCrunch reports it:

Zuckerberg recalled talking to high schoolers recently and asking them how they communicate with one another. They don't like email. "It's too formal."

And here's the edit from the Telegraph:

Email is too slow... email is too formal. There is too much friction, like the filling in the subject line... when people send an email.

I've heard the whole email-is-dead thing before, but the reason for the distaste for email was made unusually clear here. (Perhaps too clear - it came up so many times it began to feel forced, like it wasn't quite justification enough for the shiny new post-email messaging system, but anyway).

The email subject line was made to sound like nineteenth-century penmanship or table manners, like case agreement in Greek, like the structural devices that make a sonnet a sonnet. A friction on the free movement of thoughts and ideas from person to person.

I get the same feeling when I think about Kinect, whose point of differentiation seems to be the idea that the Wii is great, but having to have a controller in your hand is basically like having to build the console yourself. Or when I think about contactless payment technology, especially when it's embedded in phones like payWave; or what Google Voice Search will no doubt become.

It's the idea that the best tools are the ones that provide immediacy, in the broad sense of putting nothing between you and your object; that provide agency without the bother of mechanism. They promise a world without effort, without delay and without complexity; everything available and amenable immediately. A wave of the hand, and thought becomes action. In other words, magic.

But magic requires surprise. Be honest, does paying for something with your phone, or sending an instant message, or searching by speaking, feel like it's magic? Novel maybe, but not magic. When everyone's got magic powers, being a magician is, I imagine, rubbish.

All of which will push more and more magical innovation which will become unmagical immediately. Apple even used the word 'magical' to describe the iPad when it launched. It sounded stupid at the time, more so now you see them on trains and buses. Yesterday, Mary Meeker gave a presentation to the Web 2.0 summit in which she marvelled at the innovation rate of massive companies like Apple and Google. But in the humdrum magic-making industry that kind of innovation rate is required to survive.

Meanwhile, with ten iPads on every bus, buses become more magical than iPads. Rare, extraordinary, unfathomably complex to most of us, things-that-obviously-have-inner-workings are oddly fascinating because they buck the trend of our expectations. How else to explain the rise and resurgence of Etsy, allotment gardening, Newspaper Club, home electronics, Shop Class As Soulcraft, the Google Chrome TV ads, and that odd BBC show in which Giles Coren and Sue Perkins re-enact The Good Life?

The novelty value of magical tools may not go away any time soon, and the innovation rate may not slacken, but it feels like the capacity to genuinely fascinate - in the way that HTML used to fascinate geeky kids fifteen years ago, or DIY computers fifteen years before that - is dropping. Magic's expected, so now mechanical is the new magical.

So if you're thinking of knocking out a smartphone app as part of your digital marketing push, wonder if it won't just get lost in the tide of magic. Even when you use a smartphone app, you're basically just watching a magic show. See if you can make something that people will want to muck around with instead.

# Alex Steer (17/11/2010)


Interesting now

735 words | ~4 min

Noah Brier's post on Managing Flow (the more dynamic, less permanent side of content management) got me thinking about what content (or flow) prioritization should look like now. The quoted section from Danah Boyd in particular sparked a thought:

This is not simply about aggregating or curating content to create personalized destination sites. Frankly, I don't think this will work. Instead, the tools that consumers need are those that allow them to get in flow, that allow them to live inside information structures wherever they are and whatever they're doing.

The thought also spun off from something a couple of days back, when I was thinking about Twitter lists. It struck me that lists are a curiously old-fashioned response to the problem of too much information. Lists have always been a valuable aid to recall or retrieval, from the times tables (learned as poetry, not maths) to the ship roll-calls in the Iliad to dictionaries to library catalogues to the patent archives in the British Library to site maps (for more on lists, see Umberto Eco's The Infinity of Lists). But they're always head-dependent, whether by letter or number or subject or something else, which makes them neat but less optimal than search paradigms (strings, regular expressions, and so on).

Lists let you run; search lets you teleport. Lexicography got a lot easier when regex queries came along, just like site maps became less necessary after inurl: searching. There are downsides to teleporting, of course, especially loss of serendipity. A former tutor of mine used to say that one of the best things about the open stacks in Cambridge University Library was finding books you never knew existed, next to the ones you were looking for. When you're in a hurry, though, it's hard to dispute the usefulness of search, barring those increasing occasions where sheer volume of results means you need a guide who knows the terrain. (That's what makes good editors good.)
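That teleporting is the whole trick of pattern-based search. A toy illustration, with a handful of invented headwords standing in for a dictionary file:

```python
import re

# A few invented headwords standing in for a full dictionary file.
headwords = ["lexicography", "bibliography", "concordance", "teleport", "serendipity"]

# One pattern jumps straight to every '-ography' word; no browsing, no list order needed.
pattern = re.compile(r"ography$")
print([word for word in headwords if pattern.search(word)])
# ['lexicography', 'bibliography']
```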

Twitter lists, like all lists, depend on a theory of the useful organizability of the world. (That's even true of alphabetical lists, if you think about it. It's what makes encyclopedias so weird: an organization of things in the world as if they were words or books.) The big problem is that it relies upon the consistency of that theory. The way you organize information today has to be equally meaningful to you tomorrow.

Fine if you're a medieval librarian, and you and all your books can reasonably be expected to remain in one place. Less fine if you're using Twitter on a mobile device.

Mobile devices fit in with your life pretty well, so there's no expectation that you will be in the same place or state of mind each time you use one. That means lists don't do a great job, because your theory of useful organization will keep changing.

But search doesn't do a great job either, because so much of the fun of Twitter is that library-stack feeling of discovery.

By combining list-making and the pattern-spotting ability that makes search work, though, we might just get there. What if your Twitter client had two little buttons (or some better-designed equivalent) next to each Tweet: More Interesting Now, and Less Interesting Now? Using something like the heuristic methods used to filter spam, couldn't you make a deliberately temporary organization for your tweets based on your current theory of interestingness, one that would be cleared each time? Fixed lists could still be used as training wheels - for example, if you had four Twitter lists containing people you know, people you don't know, company feeds and news feeds, and you ranked a tweet from a person you know as More Interesting Now and a company feed as Less Interesting Now, each of those actions could cause other similar items (i.e. those in the same list) to follow suit. More interesting items could move up the list of tweets you see (Digg-style), or become more colourful, or bigger, or whatever.
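To show how naive the scoring could afford to be, here's a toy sketch in Python. Everything in it is invented for illustration - the list names, the weights, the tweets - and the only point is that the state lives for one session and then gets thrown away:

```python
from collections import defaultdict

class InterestingNow:
    """Session-scoped 'interestingness' scorer. Weights start flat and are
    cleared every session, so today's theory of interestingness doesn't
    have to hold tomorrow."""

    def __init__(self):
        self.weights = defaultdict(float)  # list name -> current weight

    def more_interesting(self, list_name):
        self.weights[list_name] += 1.0     # 'More Interesting Now'

    def less_interesting(self, list_name):
        self.weights[list_name] -= 1.0     # 'Less Interesting Now'

    def score(self, tweet):
        # Tweets from lists you've just up-voted float up; others drift down.
        return self.weights[tweet["list"]]

# Fixed lists act as training wheels for the heuristic (invented examples).
timeline = [
    {"text": "Coffee?", "list": "people you know"},
    {"text": "New white paper out today", "list": "company feeds"},
    {"text": "Breaking: ...", "list": "news feeds"},
]

session = InterestingNow()
session.more_interesting("people you know")
session.less_interesting("company feeds")

for tweet in sorted(timeline, key=session.score, reverse=True):
    print(round(session.score(tweet), 1), tweet["text"])
```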

Unlike spam heuristics, or lists, it's deliberately impermanent. Unlike search, it retains serendipity.

I have no idea how you'd build it, or indeed if anyone has. Just a thought, though.

# Alex Steer (08/11/2010)


The Godwinometer - measuring crazy Hitler references in real time

227 words | ~1 min

I had some spare time today, and was thinking about how planners should make things, and then I was thinking about Godwin's Law of Nazi Analogies. So I made this:

[@godwinometer Twitter screenshot]

@godwinometer listens to Twitter for references to Hitler. Then it works out how quickly the internet is churning out Hitler references, and expresses the result as a 'Time to 100 Hitlers': the time it's taken for the hundred most recent Hitler references to be tweeted.

@godwinometer tweets every fifteen minutes. It's sort of a measure of how crazy the internet is, right now.

(Of course, many of these could be legitimate references to Hitler by historians. But it's more fun if you don't think about that.)

The Godwinometer's written in PHP and is called by a cron job. Once it's been running for 24 hours or so it'll start auto-generating daily trended charts of activity levels in Hitlers Per Minute.
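For the curious, the core sum is trivial. The real thing is PHP talking to Twitter; this Python sketch skips the polling entirely and just shows the 'Time to 100 Hitlers' arithmetic on made-up timestamps:

```python
from datetime import datetime, timedelta

def time_to_100(timestamps, sample=100):
    """Return the time span covered by the `sample` most recent matches,
    plus the implied rate in Hitlers Per Minute."""
    recent = sorted(timestamps, reverse=True)[:sample]
    span = recent[0] - recent[-1]
    per_minute = len(recent) / max(span.total_seconds() / 60, 1e-9)
    return span, per_minute

# Invented data: 100 matching tweets, one every 30 seconds.
now = datetime.now()
fake_timestamps = [now - timedelta(seconds=30 * i) for i in range(100)]

span, hpm = time_to_100(fake_timestamps)
print(f"Time to 100 Hitlers: {span} ({hpm:.1f} Hitlers Per Minute)")
```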

Just a bit of fun for people who like (a) the web, (b) numbers, (c) logical flaws or (d) Hitler (hmm, maybe not you). If you can find any use for it, enjoy.

# Alex Steer (06/11/2010)


Goodbye wristwatch, hello wrist

389 words | ~2 min

Interesting finding from Mintel on watches:

As many as one in seven Brits claim they have no reason to wear a watch as they use their mobile phone or PC to tell the time.

I imagine there will be a lot of coverage of this, most of it about the death of the watch.

So let's talk about the rebirth of the wrist.

Yes, we're actually only talking about 1 in 7 people, so maybe the figures are overhyped. And yes, 'many consumers view watches as an accessory rather than just a device to tell the time'. Though that only partly makes sense. A watch without a clock is a bracelet. So maybe it's good news for bracelets.

But let's run with the idea, just for fun. If the watch is dead, suddenly a whole piece of real estate opens up in your life. The space where your watch used to be. The space that you're conditioned to check from time to time.

Think about your wrist for a second. It's a part of your body that feeds you with data when you look at it. That's amazing. That's much better than having a smartphone.

If you watch enough science fiction from about the 50s to the 70s, it's full of devices whose common ancestor was obviously the wristwatch. They might be videos, radios, lasers or teleporters, but basically they're devices that make your wrist do magic. And then there are the brothers and sisters of the wristwatch, normally worn by runners and increasingly capable of recording and displaying sophisticated data.

Maybe I've just been watching Dentsu/BERG's incidental media too much, but talking about the death of the watch is just silly. If we can relegate a boring-but-useful function like telling the time to a device we can hide away when we don't need it (a phone), we can start thinking of brilliant new functions to put on wristwatch-type devices; functions that actually reward the kind of continuous partial attention your watch gets.

If you're wearing a watch, I bet you've looked at it at least once while you've read this. I'll stop before you look again.

# Alex Steer (05/11/2010)