Exclusive: Inside Autodesk’s Robotics Lab Of The Future

Our exclusive behind-the-scenes look at how Autodesk almost didn’t see the groundbreaking innovation right under its nose.

In the middle of a tall, wide, two-story workspace with black walls and a conference room built inside an orange shipping container, three industrial robots are sitting idle.


Two of them are identical yellow FANUCs from Japan, the kind typically used in car manufacturing or for “pick-and-place” tasks. They’re called Castor and Pollux, after the twin brothers of Greek and Roman mythology. Here, inside one of the many nearly identical maritime buildings that line San Francisco’s downtown waterfront, without a car to build or a warehouse full of items to pluck off shelves, they look a little out of place. So do their nearby cousins: an even bigger, orange, Swiss industrial robot from ABB that’s as yet unchristened, and a smaller device, a Universal Robots UR10 called Bishop.

This is the Pier 9 Workshop, design software giant Autodesk’s private maker space, 27,000 square feet of wood shop, metal shop, 3-D printers, electronics, and even a commercial kitchen.

It’s also the home of Autodesk’s Applied Research Lab and, as Fast Company is the first to reveal, the company’s new Robotics Lab, which had been under wraps since February. Castor, Pollux, Bishop, and their Swiss cousin aren’t out of place at all.


With customers in nearly every major industry you can imagine, Autodesk was certainly well aware of how robots had become a mature business technology over the last several decades. It just hadn’t thought much about how robots, software, and people will work together in the future. That thinking changed about a year ago.

For Maurice Conti and his colleagues, the light-bulb moment came when they saw the technology behind the Oscar-winning special effects in Gravity.

Blind Spots

Conti is the head of Autodesk’s Applied Research Lab and the director of strategic innovation in the company’s Office of the CTO (OCTO), which explains the octopus on the back of his business card. He’s got dark hair, an easy smile, and a salt-and-pepper goatee, and last month, sitting inside Pier 9 next to a unicorn skull and a pink R2 unit (as in Star Wars’ R2-D2) that had starred in the viral video “Artoo in Love,” Conti recalled the lab’s origins.


Autodesk built an $11 billion business by selling design software used in countless companies, large and small. From one-person architecture shops to multinational construction firms, from Hollywood visual-effects studios to oceanographic institutes, Autodesk’s tools (Maya, AutoCAD, 3ds Max, Fusion 360, and many others) are industry-standard products.

The Applied Research Lab has a unique, enviable, but hard-to-pin-down mandate: identifying what Autodesk doesn’t yet know but will need to know in five years. Or a hundred.

CTO Jeff Kowalski “said, ‘I need you to go look in our blind spots,’” Conti told me. “And I said, ‘Oh, by definition you can’t tell me where to explore.’”

In 2013, Conti’s team gave the big boss, Autodesk CEO Carl Bass, a presentation on the future of manufacturing. It highlighted four technologies: additive manufacturing, such as 3-D printing; subtractive manufacturing, such as CNC (computer numerical control) cutting machines; bio/nano, such as synthetic biology; and robotics.

Autodesk was already all over the first three, having invested years and fortunes into developing businesses around them. Not so with Robotics. “We [had] nothing really going on [there],” Conti said. “No strategy, no vision, no point of view. This [was] a real problem.”


Around that same time, Conti’s team visited Bot & Dolly, a robotics startup, acquired by Google in 2013, that had created a system letting filmmakers control industrial robot arms like those used in car manufacturing.

If you watched Gravity, the Sandra Bullock film about an astronaut caught up in a disaster on the International Space Station, you’ve seen Bot & Dolly’s Academy Award-winning tech in action. The idea is to use cameras mounted on those robot arms to shoot live action with the same single-pixel precision Pixar employs when making its computer-generated movies.

“We went to Bot & Dolly and I went ‘Oh, shit,’” Conti said. “I literally went, ‘Oh, shit. We are not paying attention to [robotics] the way we should be.’”


That was ironic, since Bot & Dolly used Autodesk’s software to program its robots. It’s just that no one at Autodesk had ever imagined such a thing.

“The robotics platform behind Gravity is all ours,” Conti said, “and we didn’t even know it, because it was never intended to be used that way.”

Robotics, in other words, was a blind spot. It was time for some vision.


The Autodesk Robotics Lab

Walk into Autodesk’s lab, and the only notable activity comes from Bishop, a collaborative industrial robot arm that, from afar at least, appears to be doing little more than drawing primitive shapes on paper.

That makes sense, though. After all, the facility is still new, and Conti and his team still pretty much don’t know what they don’t know about these machines.

One thing Conti does know is that in traditional industrial settings, big manufacturing robots are scary as hell to the corporate suits. They’re big, they move fast, and that’s why they’re usually kept inside cages or behind laser curtains, where they do their jobs and people aren’t allowed in.


They’re “dangerous,” Conti said. They can “smear you all over the wall, and you wouldn’t know it was doing it” until it was too late.

When Conti’s team decides it’s time to explore a new area, like robotics, that means “asking a lot of questions, and…making a lot of friends who are smarter than us,” he said.

After plenty of those conversations, including at Bot & Dolly, Conti thought that one direction to pursue might be figuring out how to collaborate more with robots–to make it possible for people to work side by side with industrial robots without being smeared against the wall.


One idea is programming the robots to watch people and learn the tasks the humans are doing.

“That could [lead to] a more creative, fluid interchange between robots and humans,” said David Thomasson, a principal research engineer, and a member of the five-person Robotics Lab team. “There’s a robot watching a craftsman, for example, carving wood. And it’s learning the types of cuts you prefer, and it can come in and repeat them, or make variations of what you do, in your style, so you can both be working on a job together.”

As Thomasson puts it, humans are good at sensing and problem solving, while robots are good at strength and repetitive actions. With enough software expertise, robots can learn finesse too.


Teaching the machines to learn from the people means applying computer vision as well as general sensing technologies to the problem so that the robots can grasp how people move and reproduce their motions.
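To make that concrete, here is a toy sketch of the kind of preprocessing such a system might do, and it is purely illustrative, not Autodesk’s actual code: raw tool positions observed by a camera are jittery, so before they can become robot waypoints they might be smoothed and thinned. The function names and parameters here are invented for illustration.

```python
def smooth(points, window=3):
    """Moving-average filter over observed (x, y) tool positions,
    to damp sensor jitter before planning robot motion."""
    out = []
    for i in range(len(points)):
        lo, hi = max(0, i - window // 2), min(len(points), i + window // 2 + 1)
        chunk = points[lo:hi]
        out.append((sum(p[0] for p in chunk) / len(chunk),
                    sum(p[1] for p in chunk) / len(chunk)))
    return out

def downsample(points, every=5):
    """Keep every Nth observed point as a robot waypoint,
    always including the final point so the path ends where the human did."""
    kept = points[::every]
    if points and kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept

# A human drags a tool in a straight line; the camera reports 11 samples.
observed = [(i, 0.0) for i in range(11)]
waypoints = downsample(smooth(observed), every=5)
```

A real system would of course work in 3-D, run computer vision to extract the positions in the first place, and feed the waypoints to a motion controller; the sketch only shows the observe-then-condense shape of the pipeline.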

“The more we can come up with software that can learn and perceive its environment,” Thomasson said, “the more capable the hardware’s going to be. That’s a big reason we see [Autodesk has] a place in robotics as a company, because more and more, it’s a software problem.”

Moving Beyond CAD

For a company that has made huge amounts of money from computer-aided design software, you’d think CAD would be sacrosanct. Maybe not, Thomasson suggests. He explains that one “internally disruptive” robotics project is actually imagining a future without CAD software.


The idea is that robots could learn a visual language for cutting shapes far more precisely than humans could ever do by hand.

“If I could put a bit of plywood on a workbench, and with pencil marks indicate what I want cut out,” Thomasson said, “the robot watches, and then moves in and cuts what I want.”

Thomasson shows me that he can draw corner marks on a piece of paper, and Bishop responds by drawing a rectangle using those corners. Right now, all it can do is draw, but that’s an early proxy for a laser cutter.
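As a rough illustration of the logic behind that demo (assuming nothing about the lab’s actual implementation, and skipping the computer-vision step that finds the marks), turning four detected corner marks into a closed drawing or cutting path might look like this:

```python
import math

def order_corners(marks):
    """Sort four (x, y) marks counterclockwise around their centroid,
    so they trace a simple quadrilateral instead of a bow tie."""
    cx = sum(x for x, _ in marks) / len(marks)
    cy = sum(y for _, y in marks) / len(marks)
    return sorted(marks, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))

def rectangle_toolpath(marks):
    """Turn four corner marks, given in any order, into a closed path
    a robot could follow with a pen (or, eventually, a laser cutter)."""
    corners = order_corners(marks)
    return corners + [corners[0]]  # return to the start to close the loop

# Four pencil marks on the workpiece, detected in no particular order.
path = rectangle_toolpath([(10, 0), (0, 0), (10, 5), (0, 5)])
```

The real task is harder in every dimension (finding the marks by camera, calibrating camera coordinates to the robot’s frame, planning safe motion), but the core idea, sparse human marks expanded into a precise machine path, is this simple.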


Like many things at Autodesk, the impetus for this particular project came from Bass, the CEO, a man well known inside the company for his DIY skills.

Bass had asked why a robot couldn’t cut a shape drawn by a human. “‘Why should I have to jump on a computer to do that,’” Thomasson recalled Bass asking. “‘Why can’t I just gesture it?’ That became my mission for the next few weeks. That’s one of the great things about having a maker as a CEO.”

Printing A Bridge

It’s still a little too early to know all the ways Conti’s team plans on using its robots.

One way will be working in parallel with a project in Amsterdam to 3-D-print a bridge. That project will involve an industrial robot printing the bridge in stainless steel “in mid-air,” Conti explained.

The technology to pull that project off doesn’t quite exist yet, so as the work proceeds and the Amsterdam team at the design firm MX3D develops its code, it can send that code back to Autodesk, where “we can tweak it, and send it back,” Conti said.

At the same time, the Robotics Lab team is working on new ways to control its industrial robots. One big goal, Conti said, is to get Castor and Pollux to collaborate in real time with high precision.

In part, then, that means coming up with tasks the two robots can do together. Conti said he’s not sure yet what those jobs would be, but one could be projection mapping, a technique for using software to project imagery on irregular shapes like buildings, cars, or even shoes.

Among Autodesk’s inspirations for getting into projection mapping is the Bot & Dolly short film “Box,” which showcases a stunning use of software and precisely controlled robots.

Successful Technology Transfer

Although the Applied Research Lab generally doesn’t worry about having to quickly productize its work, that doesn’t mean that it never does.

When the team finds approaches with commercial potential, it brings in Autodesk product leads as early in the process as possible so that everyone shares the resulting insights.

“If I were to go to the head of [an Autodesk product] division with a binder of stuff, saying, ‘You guys should be working on robotics,’ it’ll never happen,” said Conti. “It’s not the hand-off. It’s not, ‘We’ve written a bunch of code, now you guys go build a product.’ It’s, ‘Yeah, it’s interesting, let’s explore it together.’”

That, Conti added, “is when we let go.”

That’s already happened to a certain extent with robotics, as the team has come up with all-new methods for controlling the robots using Dynamo, an Autodesk visual programming language. They’ve also developed ways to use Fusion 360, the company’s mechanical design and engineering platform, to work with the robots.

Together, those two projects have already made the Robotics Lab’s work “one of the most successful technology transfers we’ve done,” Conti said, “because it happened very, very quickly.”

That hardly means the lab’s work is done. It just means that maybe Conti and his colleagues will have to look a little further into the future to see the next blind spot. And they’ll have to make more smart friends.

“We will continue to do robotics,” Conti said, “but it may look very different than it does today. If I could tell you [what it will look like], then we wouldn’t be looking far enough out.”


About the author

Daniel Terdiman is a San Francisco-based technology journalist with nearly 20 years of experience. A veteran of CNET and VentureBeat, Daniel has also written for Wired, The New York Times, Time, and many other publications.