A firm called SOASTA, which dubs itself the “leader in cloud-based performance and functional testing,” has news today about a release of its CloudTest Platform: something that “for the first time” allows “functional test automation for continuous multi-touch, gesture-based mobile applications.” Multitouch, gestures, apps, and the cloud all in one product: it’s a tech writer’s heaven. Within the news, though, are a couple of important trends connected to the development of smartphone and tablet technology.
SOASTA’s tech is based on something it calls TouchTest, and the idea is to precisely “capture and playback” all of the “continuous touch gestures including pan, pinch, zoom, and scroll” that you may get up to as the owner of an iDevice or an Android phone or tablet. The company argues that previous attempts at this sort of testing relied on optical techniques “that are not precise enough to test this generation of mobile apps” and that, because its system lives inside the device itself, it can capture more detail about the fine movements of user fingertips, thus “replacing brittle optical recognition approaches.”
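SOASTA doesn’t publish TouchTest’s internals, but the core capture-and-playback idea can be sketched roughly: record each touch point with a timestamp relative to the start of the gesture, then replay the points with the same timing so the gesture’s speed and path are reproduced. Everything below (the `GestureRecorder` class, the `replay` function, the injection callback) is a hypothetical illustration, not SOASTA’s API.

```python
import time
from dataclasses import dataclass

@dataclass
class TouchEvent:
    t: float   # seconds since the gesture began
    x: float   # screen coordinates of the touch point
    y: float

class GestureRecorder:
    """Hypothetical on-device recorder: timestamps each touch point."""
    def __init__(self):
        self.events = []
        self._start = None

    def capture(self, x, y):
        now = time.monotonic()
        if self._start is None:
            self._start = now  # first touch anchors the gesture clock
        self.events.append(TouchEvent(now - self._start, x, y))

def replay(events, inject):
    """Replay captured events, sleeping so each fires at its recorded offset."""
    start = time.monotonic()
    for ev in events:
        delay = ev.t - (time.monotonic() - start)
        if delay > 0:
            time.sleep(delay)  # preserve the original gesture timing
        inject(ev.x, ev.y)     # hand the point to the platform's input layer

# Example: record a short pan and replay it into a list instead of a device.
rec = GestureRecorder()
rec.capture(100, 200)
rec.capture(110, 210)
played = []
replay(rec.events, lambda x, y: played.append((x, y)))
```

Recording exact coordinates and timings like this is why an in-device approach can be more faithful than optical recognition: nothing is inferred from screenshots, so a fast pinch and a slow pinch replay as genuinely different gestures.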
Alongside this, it has a Private Device Cloud system that lets you use the devices you already own to test “end user experiences from real devices around the world.” It’s aimed mainly at the enterprise, so that employee devices can be used to crowd-test applications, and there’s an administration console as well as the actual app-testing system.
This is all very neat. Though Apple’s very strict UI requirements mean it’s rare to find an app whose interface is horribly clumsy to interact with via touch, such apps definitely do occur … and Android is famous for having issues like this dotted among its apps. A test environment like this would let app designers hone their interfaces.
Plus, when apps were simple, accessed by users numbering only in the hundreds or thousands, and not particularly critical (whether for business use or in more sensitive situations like medical environments), the way the interface worked wasn’t all that vital. But as the app economy grows up, people are relying on apps more and more to communicate, to run their businesses, to interact with customers, to diagnose medical conditions, and for other much more significant uses. In these situations, a precise and responsive UI is vital to making the app work, to keeping and delighting customers, and so on.