Multi-touch, gestural interfaces are the new black, and for the next four to five years they’re the immediate future of our ever-evolving human/computer interactions. But as a designer, I’d like to project a little further into the future and envision an even more likely scenario: true sense integration on mobile and desktop computing devices.
As designers, we usually only get to consider how media looks, sounds, and feels in a mildly tactile sense. In the future, we’ll be able to consider these variables at a much greater depth and dimension than that of a static, unchanging substrate. I also wouldn’t be surprised if smell and taste gained much greater prominence in the designer’s arsenal.
Specifically, there are certain kinds of interactions with mobile and desktop devices that don’t seem very far off from a technology standpoint. They do, however, require weaning us off the idea of doing our computing through a screen-topped device with a gestural input mechanism. Multi-touch interfaces don’t have a ton of utility for people with certain disabilities, and they definitely don’t exploit the other mechanisms we humans have for conveying and receiving information.
Here’s what I’m dreaming of…
An earpiece that doubles as a phone and really understands what I want.
I don’t always need to see the Internet to be able to grasp the information from it.
If you’re looking to access the visual Internet, the iPhone dominates the field for ease of use and clarity and will likely be the gold standard for some time. But what if I’m going out on the town and don’t want that phone in my pocket? Make the earpiece a phone as well, and pair it with trainable natural-language voice recognition software driven through the cell-phone network that learns my voice, my needs, and my quirky slang.
I could imagine the earpiece phone recognizing commands such as “give me turn-by-turn directions to Pacific Place,” “pay my cell phone bill with my credit card,” or “text my friend Joanie that I’ll be twenty minutes late,” and being smart enough to fulfill my requests without any major hiccups.
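To make the idea concrete, here’s a toy sketch of the kind of intent matching such an earpiece might do behind the scenes. Every pattern and intent name below is a hypothetical illustration of mine, not any real assistant’s API:

```python
import re

# Hypothetical command patterns -> intent names. A real system would use
# trainable natural-language models, not hand-written regexes.
INTENT_PATTERNS = [
    (r"directions to (?P<place>.+)", "navigate"),
    (r"pay my (?P<bill>.+) bill with my (?P<method>.+)", "pay_bill"),
    (r"text (?:my friend )?(?P<name>\w+) that (?P<message>.+)", "send_text"),
]

def parse_command(utterance: str):
    """Return (intent, slots) for a recognized command, or (None, {})."""
    for pattern, intent in INTENT_PATTERNS:
        match = re.search(pattern, utterance, re.IGNORECASE)
        if match:
            return intent, match.groupdict()
    return None, {}
```

Feeding it the Joanie example above would yield a `send_text` intent with the friend’s name and the message pulled out as slots; the hard part, of course, is getting from quirky slang to anything this tidy.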
This is a true expression of cloud computing separate from the desktop, and it’s where Google is starting to lay the groundwork with services such as 1-800-GOOG-411. They claim it’s a not-for-profit venture, but it makes a heck of a lot of sense in their long-term strategy: a universe of cloud-driven Internet tools with great utility for a broad audience that further helps them sell search advertising.
Knowing how excited people get about these kinds of interfaces, I could see them being smart enough to recognize patterns of behavior and quietly prompt you: “Did you mean to pass by the cereal aisle? I know you like Lucky Charms.” (Okay, that would be scary…)
A touch interface that communicates through sense of touch, not screen activity.
What’s the weather going to be? I go to the weather service on my phone, and when I touch the screen to see what the upcoming weather’s going to be like through the weekend, the surface of the touch interface gets hotter or colder depending on the time period my finger hovers over. Sounds frilly, right? Sure, if you aren’t blind. Blind people should be able to ask their phone, “What’s the temperature going to be tomorrow?” and have the phone adjust its heat output in relation to today’s temperature to indicate the relative difference.
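As a rough sketch of how such a thermal cue might be scaled, the device could map the forecast’s difference from today’s temperature to a heating/cooling level. All the numbers here are arbitrary assumptions of mine, not any real device’s spec:

```python
def thermal_feedback_level(today_temp_f: float, forecast_temp_f: float,
                           max_delta_f: float = 20.0) -> float:
    """Map a forecast temperature difference to a feedback level in [-1, 1].

    -1.0 = full cooling, 0.0 = neutral (same as today), 1.0 = full heating.
    The 20-degree saturation point is an illustrative choice, not a standard.
    """
    delta = forecast_temp_f - today_temp_f
    level = delta / max_delta_f
    # Clamp so extreme forecasts don't ask the hardware for more than it has.
    return max(-1.0, min(1.0, level))
```

So a forecast ten degrees warmer than today would warm the surface at half intensity, and anything twenty or more degrees off would saturate the output in that direction.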
Another example. Let’s say I’m considering taking SR-520 over I-90 to get to the Eastside. I ask my phone (using my voice interface) how the traffic is on SR-520. The steering wheel gets 30% stiffer. Should I take I-90 then? The steering wheel softens dramatically. There are other ways of getting data besides me barking orders at my phone/car/computer and having it bark back a series of choppily-voiced words, interrupting my enjoyment of the new MGMT album.
Yes, the multi-touch gestural interface is very cool and gets rid of that mousy thing on the desk. But I want more sense out of my touch interactions.
Forget the idea of the phone altogether. It’s part of the devices around me.
I know phone manufacturers and carriers want to keep us locked into devices and plans that earn money for large publicly traded companies through night and weekend minutes… but doesn’t that idea sound… quaint?
I’d be perfectly happy if phone calls followed me from device to device around me, instead of me having to carry a device around in my pocket. Sure, there is the love that I’d lavish on a phone as part of my technological pocket arsenal next to the iPod, the (soon to be smart) wallet, my house keys, my sketch notebook, and my pack of mints. But I’m of the “less is more” camp, and less means no phone whenever possible.
Since I’m Gen X, I’m cool with being a little out of touch. I’m already seeing that cell phone use will stratify, with phones designed for the youth as part of their uniform, while from Gen X on up, a phone is seen as a necessity, not a form of entertainment. Higher-end luxury phones will be wispy, while phones for the youth will be badges.
But really, I’d like to get rid of the word phone altogether. Or at least call this new category of devices something else. The whole beauty of the term “mobile device” is that you don’t have to say it’s a phone/MP3 player/GPS/knife/Wii remote. Let’s just tack the word “multi-sensory” onto mobile devices and hope that the device manufacturers can back it up with something that delivers real utility to us technology junkies.