Yesterday afternoon I had a chance to try on a prototype of Will.i.am’s new Android bangle, the PULS. Like post-“Joints & Jam” Black Eyed Peas, it is egregiously terrible and borderline offensive, with a clunky UI and even clunkier hardware, all powered by a halfhearted, mood-sensing technology designed to “spread positivity” and “make the world a better place,” whatever that means. It is the aging American Idol hopeful wearing a slouchy beanie inside the karaoke bar, belting out face-scrunching renditions of “Where Is The Love?” to the backs of strangers.
But I won’t waste any more of your time with a takedown. My colleague John Brownlee already ran through the specs, and they are as awful as advertised, over at Co.Design.
If the smartwatch has one thing going for it (and it very well might not!), it is the mood-sensing software tucked inside it, developed by an Israel-based company called Beyond Verbal.
If you don’t mind the stares of the people around you, the PULS has a mood-reading feature: talk into it for 20 seconds, saying whatever you want (which is strange), and Beyond Verbal’s acoustic-identification software will assess the recording. The company calls this “emotions analytics,” and it’s already built into an app you can download today. The software relies on vocal intonation, rather than the words themselves, to characterize your emotional state (sad, happy, annoyed), and it’s compatible, I’m told, with 32 languages.
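Beyond Verbal’s actual model is proprietary, but the basic idea of mapping vocal intonation to a mood label can be sketched in a few lines. The feature names and thresholds below are illustrative assumptions, not the real pipeline:

```python
# Toy sketch: mapping crude prosodic features from a ~20-second clip
# to a mood label. The features (pitch, pitch variance, energy) and
# cutoffs are invented for illustration; Beyond Verbal's real model
# is proprietary and far more sophisticated.

def classify_mood(mean_pitch_hz: float, pitch_variance: float, energy: float) -> str:
    """Return a rough mood label from normalized vocal features."""
    if energy > 0.7 and pitch_variance > 0.5:
        return "happy"      # loud, animated delivery
    if energy > 0.7:
        return "annoyed"    # loud but flat delivery
    if mean_pitch_hz < 120 and energy < 0.3:
        return "sad"        # low, quiet speech
    return "neutral"

print(classify_mood(mean_pitch_hz=210.0, pitch_variance=0.8, energy=0.9))  # happy
```

The real work, of course, is in extracting those features from raw audio and doing it across 32 languages; the classifier itself is the easy part.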
Will.i.am seems to think that, once PULS hits a certain saturation point (which, lol), people can use it to find “happy places” like a club or party. Or they can, for example, send positive vibes to PULS-wearing friends who are in a crappy mood.
If that sounds like unenlightened nonsense to you, well, rest assured you aren’t alone. Yet Will.i.am’s wishful thinking aside, emotions analytics could have far-reaching applications in far more boring and useful fields, especially in the nascent world of human-to-machine interfacing. (Investors certainly seem to think so: In September, Beyond Verbal raised $3.3 million in funding.)
Say you’re calling an automated phone system like Bank of America’s. Emotions analytics could in this instance be used to influence the decision tree of options that the robotic voice on the other end uses to assist you. If you, the customer, sound increasingly annoyed as the call drags on, the BofA robot can enact a chain of suggestions to help you calm down so that you don’t cancel your account. (Or maybe even help you! But probably not.)
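A sketch of how that emotion signal might steer the decision tree, assuming the classifier above has already produced a mood label. The moods, prompts, and escalation rule are all hypothetical, not anything Bank of America actually runs:

```python
# Hypothetical sketch of an emotion-aware IVR decision tree.
# The mood labels, menu prompts, and escalation threshold are
# assumptions for illustration only.

def next_prompt(mood: str, failed_attempts: int) -> str:
    """Pick the next menu prompt, escalating when the caller sounds annoyed."""
    if mood == "annoyed" or failed_attempts >= 2:
        # A frustrated caller skips the menu maze entirely.
        return "Transferring you to a representative."
    if mood == "sad":
        return "We're here to help. Say 'balance' or 'payments'."
    return "Main menu: say 'balance', 'payments', or 'more options'."

print(next_prompt("annoyed", 0))  # Transferring you to a representative.
```

The interesting design question is the escalation rule: routing annoyed callers straight to a human costs the bank more per call, which is exactly why you should be skeptical it would ever ship that way.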
Or imagine a future where voice assistants are a common feature in our cars. If your in-car dash can sense that you are stressed or annoyed after a long day at work, it can cap your top speed so that you won’t barrel through a school zone and hit anyone.
That’s all hypothetical for now, though. Beyond Verbal CEO Yuval Mor says, “Cognitive language is a poor emotional yardstick,” which is kind of true: Siri, Cortana, and Google Now currently have to take your words at face value. So while Will.i.am’s PULS smartband is poised to become a commercial flop when it’s released in the next few weeks at a still-undetermined price, know that there is at least a little something in there to kind of, sort of, love about it.