One of the first things you notice about Baxter, besides the fact that it has a face, is the way other factory employees talk about it as if it's just another factory worker. And in a sense, it is. Not only was Baxter designed to work safely alongside people on the factory floor, but it has human-like qualities intended to comfort the people working with it. The result is a personality akin to an enlightened 8-year-old: simple and self-effacing.
Unlike the other manufacturing robots Lowell Allen has seen in his 40 years at Rodon, Baxter was designed to be two things other robots aren’t: trainable and aware. It’s designed to work alongside human workers, who will be able to teach it to complete new tasks on the fly. Over time, the robot’s capabilities will be expanded via software updates, not unlike a smartphone. Indeed, Baxter was barely up and running on the factory floor before Allen started dreaming up new jobs for it. It’s not every day you see a grown man this excited about little pieces of plastic. As he places each one down on the conference room table, Allen is practically giddy.
“See?” he says, pointing at a disassembled household window lock. “We’d love for him to be able to put these together.”
The window locks are currently assembled by employees of the Rodon Group, a plastic injection molding factory in Hatfield, Pennsylvania where Allen serves as SVP of Manufacturing. Snapping these plastic parts together is a tedious job for people to do, so he’d prefer to automate it. The window locks are just the latest item on Allen’s ongoing wishlist.
That wishlist gets sent to the woman in charge of Baxter's personality and skillset, Noelle Dye, who is director of design and UX at Rethink Robotics, a few hundred miles north in Boston. The day I visit, Dye is making a rare on-site visit to the Rodon Group to check in on Baxter, which the company bought from Rethink a few months ago.
On the floor of the factory, Baxter is hard to miss. Amidst a sea of deafeningly loud and enormous machines stands a bright red humanoid, moving its arms in fluid, effortless motions. A factory worker grabs Baxter’s arm, pulls it toward a conveyor belt covered in plastic widgets, then presses a button on its wrist, telling Baxter to grab a widget. The tech then pulls the arm around to the other side of the belt, where an empty box awaits. The button is pressed again, and Baxter lets go. This is how it learns.
"Baxter's smart like a child," Dye says, "but he has no idea what his world is all about. You have to tell it, just like you do a child." Dye's team took the mother-child relationship as a paradigm during Baxter's development. "It was fascinating to hear how moms teach kids. They show; they don't instruct with words."
Baxter is a precocious pupil, rapidly repeating the same motions, picking widgets up from the conveyor belt and placing them into a box one by one until there are none left. Then, it frowns. The technician shows Baxter a new task to complete, and the robot regains its more genial expression. Watching this process, you realize that Baxter doesn't just resemble a human physically, but also in its "presence," in the most holistic sense of the word.
Humanoid robots are an old trick, but Baxter means more: it's a new paradigm in automation that may (hopefully) find its way to other sorts of allegedly "smart" devices. "In traditional automation, you would have to think through all of those [actions] yourself and put every single [action] into the robot," says Dye. "We made it so that you push one button and it learns all of it at once. Press the button to say 'pick this up here,' and Baxter learns a whole bunch of things about that moment."
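The training flow the article describes can be sketched roughly as a record-and-replay loop: each button press captures the arm's pose and a gripper action, and the sequence is later replayed in order. This is a simplified illustration, not Rethink's actual software; the class and method names here are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Waypoint:
    """One demonstrated moment: an arm pose plus a gripper action."""
    joint_angles: tuple
    gripper_action: str  # "grab" or "release"

@dataclass
class Trainer:
    """Records button presses during a demonstration, then replays them."""
    program: list = field(default_factory=list)

    def button_pressed(self, joint_angles, gripper_action):
        # Each press captures everything about the moment at once.
        self.program.append(Waypoint(tuple(joint_angles), gripper_action))

    def replay(self, execute):
        # Repeat the demonstrated motions, in order, as one task.
        for wp in self.program:
            execute(wp)

# A worker guides the arm over the belt and presses the wrist button
# to grab, then guides it to the box and presses again to release.
trainer = Trainer()
trainer.button_pressed([0.1, 0.5, -0.3], "grab")
trainer.button_pressed([0.9, 0.2, -0.1], "release")

actions = []
trainer.replay(lambda wp: actions.append(wp.gripper_action))
print(actions)  # ['grab', 'release']
```

The point of the paradigm is in what the sketch leaves out: the worker never writes the program at all; the demonstration is the program.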
It works like the "record" function that has been a part of AppleScript for over a decade, just far more humane. When you touch Baxter's arm, the cartoonish eyes on its LCD-screen face look toward you. (Dye deliberated over how to keep this facial feature simple enough to avoid the uncanny valley, drawing and redrawing Baxter's eyes until she found the right amount of realism.) The eyes know where you are thanks to the sonar system and front-facing camera installed atop Baxter's flatscreen cranium. Other sensors, and a total of five cameras, help Baxter understand its surroundings, from people nearby to objects on a conveyor belt.
The joints in both of Baxter's arms can sense outside forces and adjust the arm's own movements accordingly. If it bumps into you, Baxter can feel it. If it happens again, it learns that you're standing there and will henceforth go around you. The LCD screen on its face, which doubles as a touchscreen control panel for those training it, lets you know it's sorry.
Baxter’s LCD screen and the Linux-based software that runs on it are central components of the UI, but they’re really just the beginning. Indeed, Baxter’s entire torso is an interface, with environmentally aware moving parts, buttons, facial expressions, and even physical feedback provided by its mechanical limbs. All of that needed to be designed to make interacting with Baxter natural for humans without ever being confusing or dangerous. As with the animated eyes, being in Baxter’s physical presence has to be free of eeriness. “This is a machine,” Dye says. “People want to feel like it’s still a machine.”
Long before any sketches or prototypes were made, Dye spent considerable time observing people doing their jobs in the real world. She was fascinated to find that many people, when demonstrating how their jobs were done, would grab her hands and physically show her. She also interviewed mothers, who tend to teach their children in a similar fashion. A year-long design process began, during which Dye and a team of industrial designers and user experience experts set out to craft not just a robot, but an overall "robot experience." Normally, a manufacturing bot needs to be programmed by a technician just to do a single, repetitive task. By contrast, Baxter would need to be trainable by any factory employee to do any number of things, so it needed something its brethren in, say, a car factory don't: a user interface.
A total of eight designers had a hand in making Baxter look and act the way it does. Once the initial design concepts were complete, the team started building 3-D prototypes using blue foam and wire cutters. The screen-based interface was mocked up using paper prototypes, which had to be tested by people with particularly vivid imaginations. Before long, functional prototypes were built and brought into real-life manufacturing environments for user experience testing. “We made sure that the interactive pieces have as few buttons as possible,” explains Dye, “just like [an] iPhone and our iPad and everything else.”
But making a sophisticated robot worker with just a few buttons proved devilishly hard, and as Baxter evolved from concept sketches to a physical entity, the technical challenges began to pile up. "Being a software guy, I never even thought about this, but running cables down to the gripper at the end of the arm means that you have to design cables that can withstand millions of flexes," says Bruce Blumberg, lead software engineer at Rethink. Those moving parts need to be not only durable, but precise. As its arms move, Baxter is constantly monitoring the amount of force being used by each of its joints, Blumberg explains. It must also be aware of any force being applied externally, for example by a human worker grabbing its arm. When that happens, Baxter's limbs go into what Blumberg calls "zero force" mode, effectively releasing all tension and letting the user take control. Seemingly small details like this can turn into formidable engineering challenges.
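The "zero force" behavior Blumberg describes can be sketched as a per-joint decision: follow the controller's commanded torque during normal operation, but drop to zero torque when a large external force (a person grabbing the arm) is sensed. This is a minimal illustration of the idea only; the function name and the threshold value are assumptions, not details from Rethink.

```python
def commanded_torque(target_torque, external_force, threshold=5.0):
    """Torque to apply at one joint, given the externally sensed force.

    If a person grabs the arm (external force above a threshold), enter
    "zero force" mode: release all tension so the user can guide the arm.
    Otherwise, follow the controller's target torque as usual.
    """
    if abs(external_force) > threshold:
        return 0.0  # zero-force mode: let the human take control
    return target_torque

print(commanded_torque(2.5, 0.3))   # normal operation -> 2.5
print(commanded_torque(2.5, 12.0))  # arm grabbed -> 0.0
```

In practice this check would run continuously in the control loop for every joint, which is why the force sensing Blumberg mentions has to be both fast and precise.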
Then there were bigger challenges, like getting Baxter to see. “Computer vision is as much an art as it is a science,” says Blumberg. “The ability to be able to have a vision system that can recognize parts, be able to move the arm, to pick those parts up in pretty arbitrary lighting conditions is a big technical challenge.”
Many of the technical limitations of building Baxter were dictated by a need to maintain its disruptive price point. At $22,000, Baxter is dramatically more affordable than other manufacturing robots, both in terms of upfront costs and ongoing cost of ownership. To keep Baxter’s price tag modest, the Rethink team had to eschew certain features, such as sealing it to keep it safe from contamination in paint-heavy environments. That limits what Baxter can do, but makes it available to more businesses.
Baxter has only been in use at the Rodon Group for a few months, but it's already keeping busy. These days, it helps pack the K'Nex building block toys for which Rodon Group is best known. Unlike a human worker, Baxter could be set up to pack toys for hours on end, even overnight. To Lowell Allen, this is just the beginning. "There's another dozen jobs here we haven't thought of that we could use Baxter for," Allen says. "We're looking at adapting him to pick out defective parts from an assortment of parts, perhaps down the road, with Rethink's help." In the future, other tasks Baxter could be responsible for include separating parts by color, assisting larger servo robots in various manufacturing tasks, and even assembling multi-part objects, freeing up humans to do more complex and interesting tasks.
Allen reassures me that Baxter isn’t here to steal people’s jobs. “We’ve always ended up creating jobs with this kind of a thing; they don’t necessarily replace us,” he says. “They replace really brutal, mundane jobs that humans really don’t want to do long-term.” And all with a cartoonish grin.