Future of Mobile 2008

November 18, 2008

I really enjoyed Future of Mobile yesterday.

The day started a little sluggishly with a well-qualified panel discussing the future of mobile operating systems. I didn’t feel I learned much here – the revenues of the panelists’ businesses weren’t particularly exciting, and aside from an interesting conversation around runtimes there wasn’t a great deal to take away.

For me, things really started to take off with the presentation from Doug Richard of Trutap (disclosure: they’re a client of ours). Doug was talking about the rise of a middle class in the developing world that shares aspirations with the middle classes everywhere, and quietly pointed out our arrogance in assuming that it could be otherwise. I particularly liked his notion that Western operators would adopt defensive positions and hence take fewer risks (and be less innovative) than those coming out of India.

I didn’t devote much attention to Matt Millar from Adobe, I’m afraid – sorry Matt, but I was doing last-minute panicking about my own presentation. I’ve not watched the video yet, but whilst I’d spent more time preparing than I ever have in the past (and felt the slides were reasonably polished), I made the mistake of over-planning what I was going to say. Normally I work from bullet points and just chat around them (something I’m comfortable doing) but after my hour-long overrun at the Werks talk a month or so back I tried to restrain myself by planning what I’d say in great depth. The upshot was I felt like I was working from a script, and had to keep checking where I was, staring at a screen instead of talking to the audience. Lesson learned there, but at least I managed to get my macaroons-as-analogy-for-porting slide out.

The bloggers panel was a really good format: 6 bloggers, 6 minutes each, mirroring blogging itself. Really nice to hear Vero Pepperell evangelise a human approach to communication – as an industry we ought to know that stuff, but I can’t help feeling we need someone to gently beat it into us on occasion. Helen was righteous – nuff said.

A lunch, or non-lunch, followed. If there was a weak point to the day overall, I’d say it was the facilities. I heard plenty of people complain about a lack of wi-fi (though as a 3 USB dongle owner I managed OK), there was no lunch provided, and no coffee in one of the coffee breaks. Fortunately Kensington is full of restaurants and cafes, but it would’ve been nice to hang around in a throng during all these breaks. The auditorium itself was excellent – a lovely space, good sound, and power to most seats.

Rich Miner gave a great talk in the afternoon, filling in a bit more detail around Google and their plans, and drawing on his own history launching the Windows SPV Smartphone when he was at Orange. He gave a good if negative insight into the world of operators when he talked about product managers feeling threats from new product developments and derailing them.

Interesting also to hear about his take on mobile web apps – that they fail for reasons of network latency, lack of local storage, and access to device capabilities. Whilst you can see efforts in Android, PhoneGap and OMTP Bondi to address some of these, it’s a little way from the “web apps as future of mobile” angle which I’d heard Google were adopting.

And similarly it was good to hear Rich quizzed on the topic of Android and fragmentation by David Wood (who’s more qualified to talk about this than he?). Rather than espousing the rather bland “we don’t think fragmentation is in the interests of the industry” line I’ve heard from Google before, Rich talked about the value of having a reference implementation by which to judge others; a conformance test being introduced for OEMs; and the use of challenging and popular reference apps to provide a “Lotus 1-2-3” style evaluation of an Android implementation.

Tomi Ahonen was hilarious and upbeat as usual – full of detailed and slightly threatening stats on the hold that mobile has on us, and case studies of fantastic things launched elsewhere (usually Asia). The Tohato “worlds worst war” was my favourite: purchasers of snack products fighting one another in vast virtual armies, wonderful.

And the day finished with another panel discussion: lots of disagreement from qualified folks who’ve been doing this stuff for years, including two of our clients. We had some kind words said about us by Carl from Trutap and Alfie of Moblog fame – thanks guys! – and it was particularly interesting to hear the pendulum of fashion swing back towards applications, away from the mobile web. I wonder how permanent this effect, which is surely down to the iPhone App Store, will be?

The evening party followed, carrying on the upbeat atmosphere 🙂

My slides from the day are online here. The lens-tastic Mr Ribot took video footage of the talk which you can see here, and I heard a rumour that the official footage from the event may go online some time too.

Thanks to Dominic and all the team at Carsonified for the hard work they put into the event – I know all too well from Sophie how much this takes, and they did a cracking job. And a particular yay to Mr Whatley, who stepped in as compere at the last minute and did an excellent job of keeping the audience engaged, even in those sleepy after-lunch slots 😉

Enterprising Engineers

November 7, 2008

I’ve just gotten back from a wonderful little event at Sussex University, called Enterprising Engineers.

It’s organised by the luxuriantly bearded Jonathan Markwell of Inuda, who seems to have mastered the dual skills of fitting 36 hours into the working day and either attending or organising every digital media event in town. The format was good fun: nibbles and booze followed by a 3-person panel taking turns to talk for 10-20 minutes, and taking questions from the audience.

Tonight it was myself, Dan from Angry Amoeba, and Glenn from Madgex – all talking about products.

Dan held forth on his philosophies towards startups, how it’s affected his working life, and finished with a demonstration of his product, Tails (a beautifully simple-looking bug tracking system).

Glenn followed up with a roundhouse presentation to the face, giving the back-story to Backnetwork, a product Madgex built to connect folks at events which is nowadays languishing without further development, but nonetheless earning its keep as a tool for bartering in exchange for sponsorship.

And then I wibbled somewhat – for my Macbook decided to crash, preventing me from showing the Ghost Detector video that should have introduced my skit. I rambled on about our experiences launching Ghosty in the US, then Flirtomatic and Twitchr: the unifying theme being the complexity and engineering problems which can lurk behind even the inane, the lewd, or the playful.

I spent last Monday at Channel 4’s offices, taking part in the Mobile Game Pitch organised by Channel4, EA and Nokia. For once, I wasn’t the one doing the pitching: I went along as a mentor, helping 4Talent’s young producers prepare for their presentations.

It ended up being a long and intense day, but there were plenty of positives.

One was meeting Scott Foe – considering that he is recognised as the highest-profile game producer in the mobile industry, I should have known of him before. Still, he turned out to be a great character, had some nice American cigarettes, and gave an insightful presentation about the mobile games industry:

  • The importance of creating trans-media characters
  • The significant difference in the marketing of console games (shock & awe) vs. mobile games (sustained trickle)
  • The importance of word of mouth
  • The significant value associated with game concept creation – i.e. pre-production
  • The 2.5 years he and Nokia have invested in the development of Reset Generation (I now have to play it)

It was also great to listen to the 8 game concepts selected for the final, mostly because they came from people without a mobile background. They were ideas in their infancy, a few of them with some potential, but they showed a growing understanding of, and interest in, mobile and its potential.

Working with the W3C

November 2, 2008

Back in September, I mentioned that I’ve been invited to work with the W3C Mobile Web Best Practices Working Group, specifically to help with Content Transformation (CT).

It’s a really contentious topic. The event which I think provoked the whole discussion was Vodafone foolishly deploying a transcoder which prevented mobile sites from identifying the device used to access them: effectively breaking large chunks of the mobile web. A particularly nasty aspect of this was that the sites most badly affected were the ones which had been specifically written to deliver the best mobile experience.

The W3C CT group is creating a set of guidelines that deployers of transcoding proxies and developers can use to ensure end-users get the best possible experience of mobile content. Involved in this effort are parties from across the mobile value chain, though mostly from larger organisations which tend to participate in these sorts of things. I’m there to try and ensure that smaller parties – content owners and mobile developers – are better represented.

There have been other attempts to put together similar guidelines – the most prominent being Luca Passani’s Rules for Responsible Reformatting: A Developer Manifesto, which has gathered quite a few signatures from the development community, as well as from a number of transcoder vendors. There’s a great deal of overlap between the contents of the Manifesto and the CT document. I think this is because the two are concerned with a quite specific set of technologies, neither is trying to invent any new technology, and both have the same aim in mind: to ensure that the Vodafone/Novarra debacle, or anything like it, doesn’t recur.

What I like most about the CT document is the responsibilities it places upon transcoder installations, if they’re to be compliant – and with Vodafone in the CT group, I think it’s reasonable for us to expect them to move their transcoders to compliance at some point. The document is still work-in-progress, but right now some of these (with references) include:

  • Leaving content alone when a Cache-control: no-transform header is included in a request or response (4.1.2);
  • Never altering the User-Agent (or indeed other) headers, unless the user has specifically asked for a “restructured desktop experience” (4.1.5);
  • Always telling the user when there’s a mobile-specific version of content available – even if they’ve specifically asked for a transcoded version of the site. I think this is lovely: as long as made-for-mobile services are better than transcoded versions (and in my experience it’s not hard to make them so), users will be gently guided towards them wherever they exist;
  • Making testing interfaces available to developers, so that content providers can check how their sites behave when accessed via a transcoder (5)
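The no-transform directive in the first point is something content providers can send today, without waiting for the CT document to be finalised. Here’s a minimal sketch of a WSGI handler doing exactly that – the function name and markup are illustrative, not anything from the spec:

```python
def app(environ, start_response):
    """Serve a made-for-mobile page, asking intermediaries not to touch it."""
    headers = [
        ("Content-Type", "text/html; charset=utf-8"),
        # Cache-Control: no-transform tells a compliant proxy (a transcoder
        # included) to pass this response through unchanged.
        ("Cache-Control", "no-transform"),
    ]
    start_response("200 OK", headers)
    return [b"<html><body>Made-for-mobile content</body></html>"]
```

As the CT document notes, the directive counts whether it appears on the request or the response, so a mobile browser (or gateway acting on its behalf) could send it too.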

There’s also a nice set of heuristics referred to, which give content providers a hint of what they can do to avoid transcoding.
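Going the other way, a content provider can at least try to spot when a transcoding proxy sits in the path. A rough sketch follows; note that HTTP/1.1 proxies are required to add themselves to the Via header, but the X-Device-User-Agent header name here is an assumption – some transcoders use it to carry the original handset’s User-Agent, others use different vendor headers:

```python
def looks_transcoded(environ):
    """Guess whether a WSGI request arrived via a transcoding proxy.

    Via is mandatory for HTTP/1.1 proxies, though it doesn't distinguish
    transcoders from benign proxies. X-Device-User-Agent is an assumed
    vendor header; names vary between transcoder products.
    """
    has_via = bool(environ.get("HTTP_VIA"))
    has_original_ua = "HTTP_X_DEVICE_USER_AGENT" in environ
    return has_via or has_original_ua
```

It’s heuristic at best – which is rather the point of the CT document: guesswork like this shouldn’t be necessary.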

The big bugbear for me (since joining the group) has been the prospect of transcoders rewriting HTTPS links, which I believe many do today. I’ve been told that in practice Vodafone maintain a list of financial institutions whose sites they will not transcode, presumably to avoid security-related problems and subsequent lawsuits – which would seem to support the notion that this is a legal minefield.

The argument for transcoding HTTPS is that it opens up access to a larger pool of content, including not only financial institutions like banks which absolutely need security, but also any site that uses HTTPS for login forms. Some HTTPS-accessible resources do have less stringent requirements than others (I care more about my bank account than my Twitter login, say), but it’s not a transcoder’s place to decide when and what security is required, overriding the decisions a content provider may have made.

The CT group has agreed that the current document needs to be strengthened. Right now it is explicit that if a proxy does break end-to-end security, end-users need to be alerted to this fact and given the option of a fully secure experience. Educating the mass market about these sorts of security issues is likely to be difficult at best; I take small comfort from the fact that users will at least be given a choice rather than being forced into an insecure experience, but this still feels iffy to me.

And security isn’t just for end-users: content providers need to be sure they’re secure too, and beyond prohibiting transformation of their content using a no-transform directive there’s not much they can currently do. So I suspect we have our work cut out on this topic – and the amount of feedback around HTTPS would seem to confirm it.

The fact that we need to have either the CT document or the Manifesto is a problem in itself, of course: infrastructure providers shouldn’t be messing with the plumbing of the mobile web in the way that they have been. But given where we are right now, what are we to do? Luca’s already done an excellent job of representing the anger this has caused in the mobile development community; I hope the CT work can complement his approach.

I’m also going to write separately about the process of participating in the group; I’ve found the tools and approach quite interesting and it’s my first experience of such a thing.

Applications and stores

October 28, 2008

There’s so much news right now about mobile applications and stores, it feels like time to take stock.

When iPhone launched and I got my greedy mitts on a jailbroken Shiny from the US, one of the things I liked most about it was the dodgy “installer” app which the kind man who jailbroke it for me put there. At the time it definitely felt like the best experience I’d had of downloading and installing third-party mobile applications, and Apple have gone on to improve on it with their official Application Store.

Conventional wisdom had been that users don’t download mobile apps, a generalisation which flies in the face of our experience; we know we’ve had well in excess of 750,000 downloads of apps incorporating our Cactus UI library to date, plus the installations we’re unable to track ourselves. And our experience isn’t unique. But there have definitely been some problems taking owners of conventional smartphones through the process of downloading and installing an application:

  1. Text into a shortcode
  2. Receive a WAP Push or text message
  3. Find it, open it, click on the link
  4. Ignore the warning that you might go online
  5. Pray your mobile phone has correct connection settings
  6. Go online, wondering how much this is costing you
  7. Find out your phone isn’t supported (whatever that means)
  8. Wonder what all the fuss is about

So… application downloads to date have been by customers who are educated enough, driven enough or persistent enough to deal with this infernal procedure.

Just one more thing…

I love the mobile web. It’s getting better every year as devices and networks improve, it’s still got a long way to go, and it’s the most cost-effective means of getting a mobile service launched.

But isn’t it strange that Apple are getting massive success selling applications via an application itself – that they’re not selling and distributing iPhone apps via the web, either on the device or through iTunes? And it looks like Google are taking the same tack.

Isn’t this a pretty strong endorsement of application as a route to online content, rather than the web? And isn’t the success Apple has enjoyed with their application store testament to the fact that even in situations where the web might provide a perfectly serviceable experience (such as e-commerce), applications are a better route to take? Not that I’m suggesting we don’t need to wait and see on this one, or that there won’t be problems down the line as the quantity of content available via these stores increases.