The Fast Company Executive Board is a private, fee-based network of influential leaders, experts, executives, and entrepreneurs who share their insights with our audience.

My AI can talk. Can yours? 

In all the excitement over market potential, there appears to be confusion between chatbots and those systems that are truly cognitive and intelligent.


There have been many cinematic portrayals of conversational artificial intelligence (AI), but for me, the most vivid is still 2001: A Space Odyssey. As astronaut Dave Bowman (Keir Dullea) interacted with HAL 9000 (voiced by Douglas Rain) aboard the spacecraft Discovery One onscreen, I wondered if I would ever see anything like that in real life.


Well, here I am in NYC, looking at my flatscreen monitor, and the data display is shifting as I communicate back and forth with ASK, Crux Intelligence’s embedded AI. (Fortunately, ASK has significant guardrails in place—unlike HAL 9000 in Kubrick’s celluloid interpretation of Arthur C. Clarke’s original novel.)

AI MARKET SIZE 

According to PricewaterhouseCoopers’ Artificial Intelligence Evolution: Main Trends report, the AI market is expected to reach a value of $53.1 billion by 2026. This includes the intelligent virtual assistant (IVA)/conversational AI market, which is expected to reach a value of $23 billion by 2027.


These are significant numbers, but in all the excitement over market potential, there appears to be confusion between chatbots and those systems that are truly cognitive and intelligent.

COMPLEX COGNITION

So what makes for true cognition within conversational AI?


Firstly, the AI must be built on actual computational linguistics—unlike a chatbot, which simply follows a predefined sequence of “if-then” rules.

The Stanford Encyclopedia of Philosophy defines computational linguistics as “the scientific and engineering discipline concerned with understanding written and spoken language from a computational perspective, and building artifacts that usefully process and produce language, either in bulk or in a dialogue setting.”

Simply put, each textual element must be translated into machine-readable form, using mathematical symbols, so the machine can “read” it and then deliver the correct answer.
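To make that idea concrete, here is a minimal, purely illustrative sketch of one classic way text becomes numbers: a bag-of-words count vector over a fixed vocabulary. (The vocabulary and sentence are invented examples; real systems use learned tokenizers and dense embeddings, but the principle of mapping words onto numeric positions is the same.)

```python
def vectorize(sentence: str, vocabulary: list[str]) -> list[int]:
    """Map a sentence onto a fixed vocabulary as a vector of word counts."""
    words = sentence.lower().split()
    return [words.count(term) for term in vocabulary]

# Toy business vocabulary for illustration only.
vocab = ["revenue", "grew", "fell", "quarter"]

print(vectorize("Revenue grew this quarter", vocab))  # [1, 1, 0, 1]
```

Once every sentence is a vector like this, the machine can compare, classify, and rank text with ordinary arithmetic—which is what “reading” means to a model.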


It also needs to employ what’s known as word sense disambiguation, which ensures the AI can understand the context of each word or phrase and how it relates to the conversation as a whole. As you might imagine, this is a rich field of research. Human nuance is very tough for machines to decode, especially when terms might be vague or have several meanings depending on the situation.
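A toy version of one well-known disambiguation technique (a simplified Lesk algorithm) shows the intuition: pick the sense whose dictionary gloss shares the most words with the surrounding context. The glosses below are invented for illustration, not taken from any real lexicon.

```python
# Invented mini-lexicon: two senses of "bank" with short glosses.
SENSES = {
    "bank": {
        "financial": "place where money is kept and loans are made",
        "river": "land along the edge of a river",
    },
}

def disambiguate(word: str, context: str) -> str:
    """Choose the sense whose gloss overlaps most with the context words."""
    ctx = set(context.lower().split())
    return max(
        SENSES[word],
        key=lambda sense: len(ctx & set(SENSES[word][sense].split())),
    )

print(disambiguate("bank", "he walked along the river to fish"))   # river
print(disambiguate("bank", "she moved money into her savings"))    # financial
```

Real systems replace the word-overlap count with learned contextual representations, but the core question is the same: which meaning best fits the words around it?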

This is why conversational AI is “teachable” and improves over time. It doesn’t draw on a static, general-purpose database; its knowledge base is ever-expanding and sector-specific, because every industry has its own vocabulary. The AI must display implicit learning through experience, self-correcting via a feedback loop to improve the quality of its output over time.
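The feedback loop can be sketched in a few lines. This is a deliberately simple illustration, not any vendor’s implementation: the assistant keeps a sector-specific glossary and expands it whenever a user teaches it a term it didn’t know. (The class name and example acronyms are hypothetical.)

```python
class TeachableAssistant:
    """Toy assistant that learns domain vocabulary from user feedback."""

    def __init__(self) -> None:
        # Seed glossary; grows over time as users correct the assistant.
        self.glossary = {"arr": "annual recurring revenue"}

    def expand(self, term: str) -> str:
        """Resolve a domain term, or echo it back if still unknown."""
        return self.glossary.get(term.lower(), term)

    def teach(self, term: str, meaning: str) -> None:
        """Feedback loop: record what a domain-specific term means."""
        self.glossary[term.lower()] = meaning

bot = TeachableAssistant()
print(bot.expand("CAC"))  # unknown, so echoed back: CAC
bot.teach("CAC", "customer acquisition cost")
print(bot.expand("CAC"))  # now resolves: customer acquisition cost
```

Production systems close this loop with retraining pipelines rather than a dictionary update, but the principle holds: each correction makes the next answer better.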

VOCAL PROWESS 


Many of us prefer to talk rather than type, but machines that have not been exposed to specific accents or pronunciations often find us difficult to understand. A necessary test for conversational AIs is how well they handle ambient or background noise—few customers are sitting in a soundproofed room, after all.

If you’re licensing a business intelligence AI system, ensure that the managed services agreement includes ongoing acoustics training to handle anomalies. There’s nothing worse than bringing on a new sales chief only to find your AI can’t cope with their Bostonian brogue.

Conversely, your AI’s synthetic voice might be incomprehensible to your latest off-shore hire, so it’s worth having options to switch out audio presentation styles when needed.


Speaking of vocal styles, there’s also the tricky issue of hidden bias due to perceived gender. It’s not lost on many of us in the AI industry that both Siri and Alexa are routinely referred to as female, as was their historical antecedent, ELIZA, developed by Dr. Joseph Weizenbaum at MIT in 1966.

As an aside, my team and I decided to modulate a non-gendered voice for our AI (yes, it is possible). We wanted an AI that feels like part of the team with a neutral, yet helpful, tone for our business customers both here and abroad.

THE FUTURE OF AI


As you may know, Siri was not born at Apple. Rather, it was spun out of SRI International (the former Stanford Research Institute) in 2007, building on the U.S. Defense Advanced Research Projects Agency’s roughly $150 million program, begun in 2003, to develop a CALO: a cognitive assistant that learns and organizes.

Conversational AIs are proliferating in the marketplace, but the real test is how well they’re able to handle cognition, intent recognition, natural language processing, word disambiguation, and a range of inference mechanisms from implicit to explicit learnings.

Anything else is a chatbot, which is fine if that’s all you need. A lot of chatbots are great at their job. But when you’re in the market for an AI, check under the hood before you (metaphorically) drive it off the lot.


Kathy Leake is the award-winning Founder and CEO of Crux Intelligence, a next-gen AI-powered business intelligence solution. 
