The tool, called Cortex, uses a graphical user interface so that building an AI model doesn’t require a PhD. The honeycomb-like interface, designed by Mark Rolston of Argodesign, enables developers–and even designers–to use premade AI “skills,” as Rolston calls them, that can do things like sentiment analysis or natural language processing. They can then drag and drop these skills into an interface that shows the progression of the model. The key? Using a visual layout to organize the system makes it more accessible to non-scientists.
“Stringing together things is a thing even a child learns,” explains Rolston. “By simplifying that orchestration aspect, the stuff that’s going to stay hard–like the data transforms–is easier to understand. How they relate to each other is visually explained to the user.”
Right now, AI algorithms are buried inside complex code, but creating a graphical user interface is a crucial step toward enabling a wider range of people to become the architects of machine learning models as the technology begins to infiltrate our lives. A GUI has the potential to give designers a seat at the AI table–something that could be necessary to ensure the technology is used ethically and responsibly.
Cortex launches today from the Austin-based enterprise company CognitiveScale, which has been building AI models for businesses in financial services, healthcare, and e-commerce since 2014. CognitiveScale has been using its own version of Cortex internally to build those models for clients, but launching it to the world means that other companies that employ developers without expertise in machine learning can begin to build AI on their own. While the tool is primarily aimed at companies, not individuals, it presents an opportunity for developers and designers who work at those institutions to get their first taste of creating AI.
Creating the first AI interface
Building this AI graphical interface was no easy task. During the initial conversations with CognitiveScale’s founder and CTO Matt Sanchez, who previously ran IBM Watson Labs, Rolston says he had to admit to Sanchez that he and his team were completely lost. It took many hours before the design team could begin to understand and conceptualize what Sanchez was trying to do. “I think good designers can ride shotgun with a surgeon or jet pilot or AI programmer, and listen to them, and extract out of them things that are true to design and are true to their profession,” Rolston says. “That [didn’t] happen without hours of conversations where I [had] barely a thread or grasp on what Matt was saying.”
Machine learning functions by extracting patterns from millions–or even billions–of data points, which enables it to make decisions about new data. It’s conceptually simple to understand, but Rolston and his team had to dive deeper into the technical elements of how AI really works, something that typically takes a PhD to fully comprehend.
Their conversations started with trying to create basic terminology for different elements that the Cortex composition tool would have. Rolston likened the process to programming during his teenage years, in the mid-1980s, before terms like “file” and “folder” were ubiquitous. These terms are tied to the development of the graphical user interface, which ended the era of only communicating with a computer through code and instead offered a radical alternative: a visual representation on the screen that gave you shorthand to accomplish different tasks. “All those things are the nerdy cruft of creating computer software that’s been worked out over a very long time,” he says. “Back in ’85 there was no one way to do it. Looking at this modern situation, there was no one way to do it.”
Rolston and his team found that the CognitiveScale developers were using different words to refer to different parts of the system, so they had to get on the same page. They ended up deciding on two primary terms: skill and agent. Skills are single-purpose bits of software that can be packaged up and used again and again–kind of like Amazon Alexa skills. Agents, which are composed of skills, are the larger, more complex models that you build inside of Cortex–they could accomplish tasks like processing insurance claims using text analysis, or tracking investor sentiment in a particular industry. This nesting concept forms the core of how Cortex functions.
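The nesting Rolston describes–single-purpose skills composed into larger agents–can be sketched in a few lines of Python. This is purely illustrative: the class names, method signatures, and toy skills below are hypothetical stand-ins, not CognitiveScale’s actual Cortex API.

```python
# Hypothetical sketch of the skill/agent nesting described above.
# A Skill is a reusable, single-purpose unit of work; an Agent chains
# skills so each one's output becomes the next one's input.

class Skill:
    """A single-purpose, reusable bit of software."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def run(self, data):
        return self.fn(data)

class Agent:
    """A larger model composed of skills, run in sequence."""
    def __init__(self, name, skills):
        self.name = name
        self.skills = skills

    def run(self, data):
        for skill in self.skills:
            data = skill.run(data)
        return data

# Two toy skills chained into a claims-triage agent
extract_text = Skill("extract-text", lambda doc: doc["body"].lower())
flag_urgent = Skill("flag-urgent", lambda text: "urgent" in text)

claims_agent = Agent("claims-triage", [extract_text, flag_urgent])
print(claims_agent.run({"body": "URGENT: water damage claim"}))  # True
```

The design point is the composition itself: the agent knows nothing about what each skill does internally, which is what lets skills be packaged and reused.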
Once these terms were pinned down, Rolston and his team had to figure out how to represent them visually. The team could have done more of a “log” form, similar to Facebook, where you scroll down through content, or a windowed view, like file folders or Google Drive. But Rolston realized that the key thing the interface needed to provide was a way for the developer to see what was happening under the hood, without having to trace it through every single line of code.
So instead of using a simple list of objects to enable that level of traceability, Rolston decided on a honeycomb structure because it lets you organize the model visually in a way that makes the most sense to the user. Within the honeycomb interface, skills become bubbles that can be moved around the screen based on the way the system’s designer thinks. “Just like a desktop, I can emphasize that sense of logic in my own mind by the placement of these things,” Rolston says. “I can move the bubbles in the honeycomb to focus on them or see them a certain way. If the first processor is not as important to me as the third one, place it off to the side.”
The result is a simple honeycomb-like switchboard where you drag and drop bubble-like skills; blue and green lines show the flow of data as it moves into and out of each skill. Conceptually, it feels like GarageBand, but for AI.
For Rolston, this is the first step toward something like Squarespace, where someone with no coding experience can create a simple website. “Those are the highest order examples of programming,” he says. But in essence, he points out, these are “simple tools that got us closer to the problem than the tool”–meaning they remove the layers of technical expertise necessary to code the thing itself, and allow you to instead focus on the problem you want to solve. Rolston imagines that Cortex could act like that first step toward making AI more of a tool to solve a problem, rather than a hugely complicated thing to do in itself.
Cortex’s interface has its own aesthetic vision; Rolston is aware of the precedent he could be setting with Cortex’s design. “We looked at the problem and tried to make it clearly beautiful and ownable so the uniqueness of the problem you’re solving isn’t lost on you when you’re looking at the interface,” he says. “This isn’t a C++. This is accessible for designers, people who are focused on an aesthetic goal. If you wear something elegant you will feel more elegant, you’ll behave more elegant. We try to bring the same idea to the tool set.”
An app store for AI
Cortex’s composition tool is only one part of the system, which offers a full suite of business-focused analytics software as well. CognitiveScale is selling the tool through a software-as-a-service business model targeted at companies, with each company paying a one-time setup fee plus a monthly or yearly subscription based on its size. The other key element of the product is a marketplace, where people who build little bits of code will be able to package them up as skills for anyone else to use–for instance, if you build an image classification algorithm, you could upload it as a skill to Cortex’s marketplace. Many of these skills will initially come from CognitiveScale, but the system’s users will also be able to make and upload their own.
Jon Richter, CognitiveScale’s head of product management, explains that users will be able to apply the same text classification skill to invoices, client complaints, or healthcare claims. It’s the power of the App Store, as applied to AI.
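The reuse Richter describes–one generic classification skill configured for several business domains–might look something like the sketch below. The keyword matching is a toy stand-in for a real trained classifier, and every name here is illustrative, not part of Cortex itself.

```python
# Hedged sketch of skill reuse: one generic text-classification skill,
# instantiated three ways. Keyword lookup stands in for a trained model.

def make_classifier(labels):
    """Return a skill that tags text with the first matching label."""
    def classify(text):
        lowered = text.lower()
        for label, keywords in labels.items():
            if any(k in lowered for k in keywords):
                return label
        return "other"
    return classify

# Same skill factory, three business contexts
invoice_skill = make_classifier({"overdue": ["past due", "overdue"]})
complaint_skill = make_classifier({"refund": ["refund", "money back"]})
claims_skill = make_classifier({"dental": ["tooth", "dental"]})

print(invoice_skill("Invoice #114 is past due"))        # overdue
print(complaint_skill("I want my money back"))          # refund
print(claims_skill("Crown replacement, dental claim"))  # dental
```

Only the configuration changes between domains; the skill’s logic is written once, which is what makes a marketplace of reusable skills plausible.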
The architects of Cortex believe that the system’s mechanisms for tracing how a model functions in a real-life business situation will make it easier to build AI responsibly. That remains to be seen. One of the benefits of having highly trained scientists building AI is that they have expertise in data, and they may be better equipped to address issues of bias than your average developer. Greater accessibility also means that developers without specialized training will be building AI directly, just as society grapples with the negative implications of pervasive machine learning algorithms.
While Cortex makes AI easier to implement for businesses, it also gives designers a chance to start playing around with models in a way that doesn’t require as much education and expertise as it takes to create a machine learning algorithm from scratch. And ideally, Cortex could help designers become more data-centric in their work–and designers, in turn, may bring a human-first mind-set to the development of the technology. Rolston says that one of the designers on his team, who isn’t a programmer at all but understands the Cortex tool conceptually, was able to make a simple text sentiment analyzer–“a quick and dirty AI,” as Rolston calls it.
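A “quick and dirty” sentiment analyzer of the kind mentioned above can be as simple as a word-list score. The sketch below is an illustrative stand-in–a lexicon lookup, not a trained model, and not what the designer actually built in Cortex.

```python
# Minimal lexicon-based sentiment scorer: count positive and negative
# words and report whichever side wins. Word lists are illustrative.

POSITIVE = {"great", "love", "excellent", "happy"}
NEGATIVE = {"bad", "hate", "terrible", "angry"}

def sentiment(text):
    """Return 'positive', 'negative', or 'neutral' from word counts."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

print(sentiment("I love this great product"))  # positive
print(sentiment("terrible service"))           # negative
```

In a tool like Cortex, even this trivial analyzer would be wrapped as a skill and dropped into an agent–the point being that the barrier is composition, not algorithm design.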
“The new designer will be a data designer,” Rolston says. “This is the next key step to that idea.”