31 March 2016

The new physical UX

Tags: UX

There’s no question: machines are getting better at understanding us. Through Big Data and very clever clouds, companies and their technology are working out not only what our intentions are, but also how they can deliver a personalised response that’s more likely to elicit a reaction from us. Because, ultimately, it’s about how to make us buy more.

So whether it’s Amazon’s ‘Customers who bought this item also bought …’ suggestions, Google Ads that follow you onto unrelated pages based on the sites you’ve visited, or insights gleaned from your reviews and purchase history, personalisation is here to stay.

Tailoring to you

The clever part is that technology is tailoring information for us not because we consciously indicate a preference, but by observing what we do across our many digital devices. Companies such as Google and Apple have built up a good understanding of our likes, preferences, and habits this way. And they’re harnessing this data to give us back customised information.

So as you move around a city, for example, Foursquare will pop up on your smartphone and say, “Hey, you’re near a restaurant we think you’ll like, based on the fact that you liked this other restaurant.” And we’re going to see this on a more and more granular level. Because these machines know where we are and what we’re doing, they can make very specific suggestions. Big Brother is indeed watching.
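To make that concrete, here’s a rough Python sketch of the pattern: check which nearby venues are similar to ones you’ve already liked, and only then surface a suggestion. The venue names, coordinates, similarity table and 500-metre radius are all made up for illustration; a real service would use live location data and a learned recommendation model rather than hand-built dictionaries.

# A minimal sketch of the "you're near a restaurant we think you'll like" pattern.
# All names, coordinates and the radius below are invented for illustration.
from math import radians, sin, cos, asin, sqrt

# Restaurants the user has previously liked, plus a toy "people who liked X also
# liked Y" similarity table (hypothetical values).
liked = {"Osteria Balla"}
similar_to = {"Osteria Balla": {"Trattoria Cinque", "Pasta Emilia"}}

# Candidate venues nearby, with made-up (lat, lon) coordinates.
venues = {
    "Trattoria Cinque": (-33.865, 151.130),
    "Sushi Hub": (-33.867, 151.128),
}

def distance_km(a, b):
    # Great-circle (haversine) distance between two (lat, lon) points.
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

def suggest(current_location, radius_km=0.5):
    # Suggest nearby venues that are similar to ones the user already liked.
    for venue, coords in venues.items():
        if distance_km(current_location, coords) > radius_km:
            continue
        for past in liked:
            if venue in similar_to.get(past, set()):
                yield f"You're near {venue} - we think you'll like it, because you liked {past}."

for message in suggest((-33.866, 151.129)):
    print(message)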

Let’s get physical

The physical user experience (UX) part comes from this understanding of you and your natural world. I don’t pick up my phone and tell it, “I’m at Five Dock” or “I’m in the CBD”. It knows. Similarly, with Google Voice search and Apple’s Siri, I get much more tailored responses because these virtual assistants are contextually aware and are drawing on both my data and other available information. So, for example, I can ask Siri or Google Voice, “Do I need an umbrella tomorrow?” They’ll know from my calendar that I’ve got a 2pm appointment tomorrow in Ingleside, and if rain is forecast there, they’ll tell me to bring my brolly.
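Under the hood, that answer is just a couple of lookups stitched together. Here’s a rough Python sketch; the calendar entry, rain probabilities and the 50 per cent threshold are all invented for illustration, and a real assistant would query calendar and weather services rather than hard-coded dictionaries.

# A minimal sketch of answering "Do I need an umbrella tomorrow?"
# Calendar entries, forecasts and the threshold are hypothetical.
from datetime import date, timedelta

tomorrow = date.today() + timedelta(days=1)

calendar = {
    # date -> list of (time, location) appointments (made-up data)
    tomorrow: [("14:00", "Ingleside")],
}

forecast = {
    # (date, location) -> chance of rain, 0..1 (made-up data)
    (tomorrow, "Ingleside"): 0.8,
}

def need_umbrella(day):
    # Check whether rain is likely wherever the user's appointments are that day.
    for _, location in calendar.get(day, []):
        if forecast.get((day, location), 0.0) >= 0.5:
            return f"Yes - rain is likely in {location}, so bring your brolly."
    return "No umbrella needed, as far as I can tell."

print(need_umbrella(tomorrow))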

This is the physical UX we haven’t had in the past. We’ve had fingerprint sensors for years; we’ve had haptic feedback for decades. What’s new is this ability for systems to understand us without us explicitly giving them a command. Pair that with the massive data crunching of “Hey, last week, Jules bought this watch”, “He shopped in this store” or “He reviewed this restaurant on Foursquare”. Based on that kind of information, plus where you are and what you’re doing, machines can now chime in with suggestions that may be useful to you.
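As a rough illustration of that pairing, the sketch below weights a handful of made-up implicit signals (a purchase, a store visit, a Foursquare review) and uses them to rank equally made-up nearby offers. The signals, weights and categories are invented; the point is only the shape of the idea, not how any particular company actually does it.

# A minimal sketch of pairing implicit history with current context to rank
# suggestions. Every signal, weight and offer below is invented for illustration.
from collections import Counter

# Implicit signals gathered from past behaviour (hypothetical events).
history = [
    ("purchase", "watches"),
    ("store_visit", "menswear"),
    ("review", "italian_food"),
]

# Weight each kind of signal differently: a purchase says more than a review.
signal_weight = {"purchase": 3.0, "store_visit": 2.0, "review": 1.0}

# Things that could be suggested right now, tagged by category (made-up catalogue).
nearby_offers = {
    "Watch strap sale at the arcade": "watches",
    "New tailor opening on the corner": "menswear",
    "Lunch special at the pizzeria": "italian_food",
}

def rank_offers():
    # Score each nearby offer by how strongly past behaviour points at its category.
    interest = Counter()
    for kind, category in history:
        interest[category] += signal_weight.get(kind, 1.0)
    scored = [(interest.get(cat, 0.0), offer) for offer, cat in nearby_offers.items()]
    return [offer for score, offer in sorted(scored, reverse=True) if score > 0]

for offer in rank_offers():
    print(offer)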
