Way back in 2010, Apple spent some of its fast-amassing cash pile to buy Polar Rose, a face recognition firm from Sweden. It seems Apple has been busy ever since incorporating Polar Rose's face identification and tracking algorithms into iOS 5, the upcoming revision of the operating system that powers iPhones and iPads. The integration runs deep: it's far beyond a simple app, and there are API hooks for developers.
This is huge news, for all the reasons that Google's use of face recognition in its online offerings could change much about the web. By adding controls to the iOS API, Apple is letting third-party apps access the core face recognition tech. Properties like "hasLeftEyePosition" and "mouthPosition," plus the image processing for identification, mean that apps can track faces and even recognize users.
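To give a sense of what those hooks look like to a developer, here's a minimal sketch of querying face features through Core Image's CIDetector and CIFaceFeature classes, which is where this API surfaced. It's written in modern Swift for readability (the iOS 5 API shipped as Objective-C), and error handling and UI plumbing are omitted:

```swift
import UIKit
import CoreImage

// Sketch: find faces in a photo and report the feature positions
// that the article mentions (hasLeftEyePosition, mouthPosition).
func describeFaces(in uiImage: UIImage) {
    guard let ciImage = CIImage(image: uiImage) else { return }

    // Build a face detector; higher accuracy trades off against speed.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    // Each detected face comes back as a CIFaceFeature with
    // optional eye and mouth coordinates in image space.
    for case let face as CIFaceFeature in detector?.features(in: ciImage) ?? [] {
        print("Face at \(face.bounds)")
        if face.hasLeftEyePosition {
            print("  left eye: \(face.leftEyePosition)")
        }
        if face.hasMouthPosition {
            print("  mouth: \(face.mouthPosition)")
        }
    }
}
```

Note that this public API does detection (where the face and its features are), not identification (whose face it is); the recognition side the article speculates about would sit on top of coordinates like these.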
This means games could track face positions as an unusual input mode, apps like Instagram could automatically tag the faces they recognize, smart video apps could use facial cues for digital image stabilization, and so on. In more interactive modes, we can even imagine an iOS face ID on an iPhone serving as an automatic log-in on a paired Mac. It's even plausible that Apple could use facial recognition as part of secure user authentication for future wireless wave-and-pay systems, which we know it's been working on.
But wait, there's more. Another relatively recent Apple purchase, Siri, is also showing up in the latest developer builds of iOS 5, alongside evidence that Apple is including code it acquired in its deal with voice recognition experts Nuance. Siri was a highly promising smart personal assistant app, and since the acquisition it has entirely disappeared, so its reappearance in iOS 5 is interesting. It could be transformational, because what Apple seems to be doing is enabling smart voice control in iOS 5 along the lines of "set up a meeting with Mark on Wednesday at 11 a.m.," where Mark is a user contact. There are also text-to-speech powers, which could be really important for using your phone while driving: we can imagine an iPhone reading out incoming text messages, and a smarter integrated navigation app (which we know Apple is also working on).
In this sense, Apple is moving the iPhone and iPad toward the famous Knowledge Navigator concept it created back in the 1980s. We're thus tempted to think it will only work in full on newer devices, possibly just the iPad 3 and the upcoming iPhone 5 (and maybe the current generation too): Apple prefers to make its enhanced user experiences "all or nothing," presumably because the degraded performance of demanding new software on older devices would be too disappointing to users.
It's also a powerful new weapon in the war against Android tablets and phones. When Google's Nexus One first emerged, we called its integration of voice control an important secret feature. But that promise has never been properly realized, much less spun into the tightly integrated smart "digital PA" that Apple seems to be working toward. By adding all this tech, Apple is enabling all sorts of clever marketing angles, and even appealing a little more to business users, something it seems keen on at a corporate level.