Rain in Tokyo - Personalisation & the Web
Something magical happens every time it starts raining in Tokyo. Convenience store entrances sprout umbrellas, store clerks dream umbrella bags into being, and commuters flock to their retail venue of choice to shelter from the rain, and to equip themselves for the journey home.
To my simple mind, this is one of the most purposeful and delightful demonstrations of User Experience Design.
By placing umbrellas at the front of the store when it rains - companies benefit from increased umbrella sales (and the halo effect, which I suppose means they probably sell more fried chicken when it rains too) and customers benefit by getting home dry. It isn’t creepy, it isn’t exploitative, it isn’t predatory; it is a direct and immediate response to a clear and acute customer need. Would it be better or worse if the hardiness of the umbrella matched the intensity of the weather, or if the style of umbrella took into account one’s individual preference in precipitatorial fashion? What if the instructions on the umbrella were always in the language of the person the umbrella was being handed to? Now we’re describing a rain-based utopia of which we can only dream, one which we’d all agree would be a better world than the damp, wet-trousered version we must daily wade through.

What if we took this same experience and applied it to how we build websites? Would it still be delightful, or would it be creepy?
When you visit Booking.com, we offer you a set of recommended destinations which we think are the ones you’re most likely to be interested in. We eye you up as you walk through our metaphorical front door, and then we rush out back to the store room to bring forth those products which we think you came here to buy. We don’t always get it right (unless you’re that one guy in Lille who only ever travels to Paris), but our intention is always to give you the best possible experience, which means trying to alleviate as much of the effort of finding a place to stay as we can. Is that creepy? Booking.com wins when our users book a place to stay; our users win when they find the perfect place for them. In this symbiosis of supply and demand, everyone wins when our recommendations are great.
So then what drives the sentiment in the mainstream & tech media which tells us that techniques like the one I just outlined are creepy, that they’re invasions of our online privacy, or that they force customers into a self-fulfilling spiral of personalisation - the Google Bubble? When Instagram followed in the footsteps of Twitter & Facebook in deploying an algorithmic ordering of the photo feed, users were in uproar. Beyond simple change aversion, users cited feeling like they were “missing out” - despite missing out being the core problem Instagram were trying to solve when they flipped the switch: to make sure people see more of the content they care about most, and less of the content which just so happens to have been posted in the minutes before they opened the app.
And yet whilst seeing a man appear with a fistful of elongated plastic tubes in which to stuff your dripping brolly is a wonderful moment, launching a photo-sharing app to see the photo your sister posted of your first-born niece 8 hours ago is regarded by many as a change for the worse.
So what drives this difference in reaction? Is it a mistrust of tech companies, inspired perhaps by these kinds of stories - where what a company knows about a customer is used not to benefit both parties in the transaction, but to benefit only the seller? Convenience stores in Japan do not raise the prices of umbrellas when it starts raining, or when a Mac user is walking by.
Maybe it is the general lack of knowledge about how the internet works in the mind of the average consumer. The language which describes the techniques tech companies use to tailor the user experience is the same language spoken when referring to state surveillance programmes - tracking, cookies, fingerprinting. Can we expect customers to differentiate between bad actors and good actors leveraging the same technology and terminology? Can we expect customers to believe that a cookie placed on their machine by Amazon is placed there in service of making a subsequent visit better, and not just to harvest their data to be used against them?
Perhaps it is something more abstract than that - the lack of a human interaction to frame the personalisation, a person who reassures you with a ‘these shoes would look great on you’ as opposed to a chunk of UI which says ‘You bought X, you’ll probably like Y’.
It feels to me like we’re entering an era of great paranoia on the web, fuelled by nefarious companies operating in the grey areas of morality and ethics, and by nation states who boast surveillance programmes which would spook even the Stasi. It means that those of us who operate anywhere on the web face a customer base suspicious of our intent and our integrity. Can we design around this problem and inspire confidence in the amiability of our algorithms? Do we have to wait for a post-privacy generation of consumers to come along, one without the physical store context driving their perceptions of retail and retail optimisation?