For a long time, the smartphone industry was split. Apple’s iPhones had relatively small screens, while Google’s Android phone screens kept getting bigger and bigger. But with the 4.7-inch iPhone 6 and the 5.5-inch iPhone 6 Plus, that divide is gone. Big is in.
For software developers who’ve optimized for the iPhones of yesteryear, the ensuing task isn’t as simple as magnifying a small existing interface onto a bigger screen. Conversations with some of the best UI designers in the industry make one thing clear: as screen real estate stretches beyond the reach of one thumb, button placement, gestures, and content layout will all change significantly. Here’s what to watch out for in the next wave of apps to come:
Right now, it’s still relatively common for an app to stick useful buttons in the top corners of your screen. In a throwback to desktop interfaces like Mac and Windows, a range of popular apps, from Twitter to Apple’s own Photos app, place all sorts of functions in a top navigation bar. But on a large screen, that top bar is too far to reach with one’s thumb.
“We suspect a lot of apps are going to start thinking about putting the main engagement options on the bottom of the screen rather than top or upper right,” explains Mark Kawano, a former Apple designer who recently tackled this exact problem in designing Storehouse for iPhone. In Storehouse, an app for publishing rich multimedia stories on mobile, Kawano’s design team placed nearly every function into not just one, but multiple bottom navigation bars that appear at various moments as you navigate through the app.
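In UIKit terms, the pattern Kawano describes is a toolbar pinned to the bottom edge. A minimal sketch in today’s Swift (a hypothetical view controller, not Storehouse’s actual code) might look like this:

```swift
import UIKit

// Hypothetical view controller illustrating bottom-anchored actions:
// the primary engagement options live in a UIToolbar pinned to the
// bottom edge, within reach of the thumb on a 5.5-inch screen.
final class StoryViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let toolbar = UIToolbar()
        toolbar.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(toolbar)

        // Pin the bar to the bottom edge, not the top navigation bar.
        NSLayoutConstraint.activate([
            toolbar.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            toolbar.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            toolbar.bottomAnchor.constraint(equalTo: view.bottomAnchor),
        ])

        toolbar.items = [
            UIBarButtonItem(barButtonSystemItem: .compose, target: self,
                            action: #selector(compose)),
            UIBarButtonItem(barButtonSystemItem: .flexibleSpace, target: nil,
                            action: nil),
            UIBarButtonItem(barButtonSystemItem: .action, target: self,
                            action: #selector(share)),
        ]
    }

    @objc func compose() { /* present the composer */ }
    @objc func share() { /* present the share sheet */ }
}
```

The same idea scales to multiple context-specific bars by swapping `toolbar.items` as the user moves through the app.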
Kawano isn’t alone in this thinking. Google Product Director Luke Wroblewski points to Google’s new Material Design principles to back Kawano’s logic. What Google dubs “Action Buttons” are placed either a third of the way down the screen or, on mobile devices, all the way in the bottom right-hand corner, so close to a righty’s trigger-thumb that only a small scrunch is needed to press the button.
But Loren Brichter–known for inventing Twitter’s pull-to-refresh gesture–warns that moving controls to the bottom isn’t always the best option. “There’s a tough tradeoff with UI-at-the-bottom. People read from the top down, and I’ve found many people may completely miss stuff at the bottom of the screen,” he says. “If you can focus users’ attention, it’s a good way to solve the problem.”
He adds: “Another way to do it–which may sound a bit unorthodox, but worth experimenting with–is to simply always allow for a certain amount of over-scroll for content. So the user can physically drag content down into thumb-reach without using the hacky and completely unintuitive Reachability gesture.”
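Brichter’s over-scroll idea can be sketched with a standard `UIScrollView`. This is a hypothetical illustration of the technique (not his code): give the scroll view some slack above its content so the user can drag the top of the page down into thumb reach and have it rest there, instead of bouncing back or invoking Reachability.

```swift
import UIKit

// Sketch of "always allow a certain amount of over-scroll":
// extra top inset lets the content settle lower than its normal
// position, bringing top-of-screen controls into thumb range.
final class OverscrollViewController: UIViewController {
    let scrollView = UIScrollView()

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        scrollView.alwaysBounceVertical = true
        view.addSubview(scrollView)

        // Let the content rest up to a third of the screen lower
        // than usual. (The fraction is an arbitrary choice here.)
        let slack = view.bounds.height / 3
        scrollView.contentInset.top = slack

        // Start with the content in its usual spot; pulling down
        // now settles at the inset instead of springing back.
        scrollView.contentOffset = .zero
    }
}
```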
Moving a button to the bottom of the screen is often a good option, but it’s not always the best one. There is, however, one surefire way to get someone where they need to go, whether the screen is five inches or 50: a gesture–or more specifically, a simple swipe.
Not long ago, Brichter lamented that UI designers weren’t pushing the capabilities of our simple finger swipes forward aggressively enough. All sorts of menus and functions could be hidden within finger movements. And now that we’re at a point where the entire upper end of the screen has grown useless for buttons, designers have never had more incentive to rethink what gestures can do.
Mark Kawano points out that Apple actually added a swipe-right-to-go-back gesture into iOS 7, which was unveiled last fall, “presumably because they knew that bigger iPhones were coming.” Storehouse built on Apple’s momentum in a clever way: Any time you’re in a story, you can swipe in any direction to exit out. It’s subtle and liberating on a small screen. But on a bigger screen, Kawano sees the gesture as a necessity.
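A Storehouse-style “swipe in any direction to exit” can be built with a single pan recognizer. The sketch below is hypothetical (not the shipping app’s code): it dismisses the story once the finger has traveled a decisive distance, regardless of direction.

```swift
import UIKit

// Hypothetical "swipe anywhere to exit" gesture: direction doesn't
// matter, only distance, so the exit is always within thumb reach.
final class StoryDetailViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self,
                                         action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        guard gesture.state == .ended else { return }
        let t = gesture.translation(in: view)
        // Any decisive swipe (threshold chosen arbitrarily here)
        // closes the story, no button required.
        if hypot(t.x, t.y) > 100 {
            dismiss(animated: true)
        }
    }
}
```

By contrast, the system-level swipe-right-to-go-back that iOS 7 added comes for free on any `UINavigationController` via its built-in `interactivePopGestureRecognizer`.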
On phones, whether you’re in portrait or landscape mode, content is generally presented to you in one pane. There’s a single window that’s filled with a stream or spread of information. But on iPads or Android tablets, it’s quite common to see apps running in splitscreen mode.
Ashish Toshniwal, CEO of Y Media Labs–makers of AAA mobile apps for PayPal and Amex–believes splitscreen will invade apps because Apple is supporting the feature more with iOS 8, and is already implementing it in the supersized iPhone 6 Plus. Native apps like Mail run in splitscreen mode, and developers will have access to new API-level tools to take better advantage of the larger screen. Now, Apple’s software-development kit lets developers decide whether or not to go splitscreen based not on whether the app is running on an iPad or an iPhone, but on how large the device’s screen is. (So you could have an app that’s splitscreen on the iPhone 6 Plus but not on the normal iPhone, for instance.) And he points to splitscreen as a barely tapped opportunity for customizing one’s app to scratch very particular user needs.
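The screen-size-based check Toshniwal describes maps onto what iOS 8 calls size classes. A minimal sketch (hypothetical controller and layout methods): branch on the trait collection’s `horizontalSizeClass` rather than on iPad-versus-iPhone, so an iPhone 6 Plus in landscape (regular width) gets the splitscreen layout while smaller iPhones (compact width) collapse to a single pane.

```swift
import UIKit

// Hypothetical controller: choose splitscreen by size class, not
// by device. The iPhone 6 Plus reports a regular horizontal size
// class in landscape, just as an iPad does.
final class StoreViewController: UIViewController {
    override func traitCollectionDidChange(_ previous: UITraitCollection?) {
        super.traitCollectionDidChange(previous)
        if traitCollection.horizontalSizeClass == .regular {
            // Wide enough: show list and detail side by side.
            showSplitLayout()
        } else {
            // Compact width: one pane at a time.
            showSinglePane()
        }
    }

    func showSplitLayout() { /* ... */ }
    func showSinglePane() { /* ... */ }
}
```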
“Consider using a GPS app in the hands-free mode. The iPhone 6 Plus user would be able to take advantage of the split screen to run their GPS to provide a more comprehensive navigation experience,” he writes. “For example, text-based directions, alternate routes, accident reports or even music controls in one view mode.”
Wroblewski pointed us to a blog post he wrote, which seems to concur. Splitscreen modes are a UI opportunity dictated by simple ergonomics. When holding a tablet or a “phablet” in landscape mode, you need to use two hands in order to balance it. This enlists not one, but two thumbs for UI navigation. Why not give them both something to do?