ChatGPT, the AI chatbot from OpenAI, burst into public consciousness, reaching 100 million monthly active users within just two months of launch. By contrast, modern robotics has developed more quietly, steadily chipping away at the challenge of getting physical machines to operate safely and productively in real-world environments.
This incremental approach has brought robotics a long way. The International Federation of Robotics (IFR) estimates a global operational stock of 3.9 million industrial robots in 2023 and projects annual installations reaching 700,000 units as early as 2026. And these figures count only industrial robots with arms; adding non-manipulating mobile robots, such as Amazon’s 520,000 robotic drive units, would push the total far higher.
Yet the technology remains under the radar for most senior leaders. Robotic automation can still seem out of reach: too complex, too costly, or too constrained in its applications. ChatGPT’s explosive rise underscores the power of user-friendly interfaces in democratizing complex technologies, and robotics developers should take note.
MODERN ROBOTICS: THE TORTOISE APPROACH
Most people know the slow, steady tortoise is supposed to win the race, and robotics certainly seems to have taken this parable to heart. Since the first modern industrial robot was installed on a General Motors production line in 1961, developers have expanded robot capabilities immeasurably.
The latest state-of-the-art models, including quadrupeds, wheeled robots, drones, exoskeletons, and robot arms, are impressive feats of physical and software engineering. Advanced vision systems of cameras, sensors, and AI algorithms enable them to navigate autonomously in complex environments. The latest motors, actuators, and end effectors (the tools that act as a robot’s hands or fingers) allow them to interact with objects quickly, precisely, and delicately.
Businesses across industries are using robotic automation to improve productivity, cut production costs, and minimize risks for employees tasked with hazardous jobs. However, there are still barriers to robotic adoption.
Getting hands-on experience with robots showed me one of those barriers firsthand: robotics systems can be hard to work with.
THE CHALLENGE OF WORKING WITH ROBOTS
EY’s Robotics Center of Excellence has been experimenting with an advanced robot arm. It is a benchtop model designed for applications such as loading parts into a machine, dispensing glue, or boxing up machine parts. It has a very modest 128 mm footprint, 500 mm reach, and 3 kg payload, and it can move objects to a positioning tolerance of 0.01 mm.
Once installed and programmed, the arm performed well at my proof-of-concept task: assembling structures out of Lego bricks. The robot had to identify a piece, connect it to others to form a predefined structure, then disassemble the structure and place the bricks back in a neat pile.
It may seem simple, but it wasn’t straightforward. Before installing or programming anything, it was important to determine the necessary components, from the type of “hand” to the grip material for the fingertips and the sensors to identify the Legos. When you buy a robot arm, you receive the robot arm. Nothing else. You either use a system integrator to assess your use case and plan and install the components, or you build your system component by component.
The journey of purchasing and setting up a robotic arm today parallels the early days of personal home computing. Acquiring a personal computer during the 1970s was anything but “out of the box.” For instance, the Altair 8800 was designed to be assembled by hobbyists and professionals, requiring considerable work and skill to put together. These early computers lacked integrated peripherals such as a keyboard or display, similar to how a robotic arm comes without a hand or sensors.
The immaturity of the market and the need for bespoke applications can be a barrier, sometimes driving DIYers to 3D-print individual components. I found myself in this position at one point: when the hard rubber pads forming the gripper’s fingertips couldn’t hold the Lego bricks, 3D-printed replacement pads made from a more flexible material fixed the issue.
LEARNING FROM CHATGPT
ChatGPT has demonstrated the power of user experience to transform access to complex technologies. Robotics developers are often more focused on the software side of this experience. Understandably so—the IFR noted in a recent report that robot programming and integration represent 50-70% of the cost of a robot application.
Working with the robot arm gives me confidence that developers are making good progress here. The user interface for programming the robot’s movements was intuitive and flexible. It was possible to manually manipulate the robot into the correct position to record a waypoint or use on-screen controls to move it into position.
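The record-and-replay pattern behind that interface can be sketched in a few lines of Python. This is a minimal illustration, not any vendor’s actual API: the `Pose` and `WaypointProgram` classes and the `move_to` callback are hypothetical stand-ins for how a user might capture waypoints by positioning the arm, then play them back in order.

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    """A Cartesian waypoint: position in mm plus gripper state."""
    x: float
    y: float
    z: float
    gripper_closed: bool = False

@dataclass
class WaypointProgram:
    """Record-and-replay: store poses as the user positions the arm,
    then command the arm through them in sequence."""
    waypoints: list = field(default_factory=list)

    def record(self, pose: Pose) -> None:
        # Called when the user moves the arm by hand (or with
        # on-screen controls) and presses "record waypoint".
        self.waypoints.append(pose)

    def replay(self, move_to) -> int:
        # move_to is a hypothetical driver callback that commands
        # the arm to a pose. Returns the number of moves executed.
        for pose in self.waypoints:
            move_to(pose)
        return len(self.waypoints)

# Usage: record a simple pick-and-place, then replay it.
program = WaypointProgram()
program.record(Pose(120.0, 0.0, 50.0))                         # hover over brick
program.record(Pose(120.0, 0.0, 12.0, gripper_closed=True))    # descend and grip
program.record(Pose(200.0, 80.0, 12.0, gripper_closed=False))  # move and release
moves = program.replay(lambda pose: None)  # stand-in for a real arm driver
```

The point of the pattern is that the operator never writes motion code: the program is just an ordered list of captured poses, which is what made the arm’s interface feel intuitive.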
Advances in AI language models will simplify programming even further. Large language models will let users tell a robot what to do in natural language. “Embodied language models,” an emerging class of AI model, enable robots to act on high-level user instructions by integrating sensor data with words and code. NVIDIA, for example, recently introduced Eureka, an AI agent powered by GPT-4 that autonomously teaches robots complex tasks without task-specific prompts or predefined reward templates, marking a significant step toward more intuitive human-robot interaction.
In the future, these AI robot brains could mean robot arm testers simply say, “Program the arm to take a glass from the dishwasher tray and place it upright under the water dispenser.” But these transformational advances rely on hardware systems as well as software. To drive wider robotic adoption, it’s crucial that both aspects of the user experience be equally strong.
LIGHTING THE FIRE
Robotics’ steady, under-the-radar progress is commendable, but it isn’t enough to capture the attention and investment the field deserves. Robotics developers need to prioritize user experience, simplifying the deployment and operation of robots to shatter one of the biggest barriers shutting smaller enterprises out of the benefits of robotic automation. As ChatGPT showed, the right user-centric approach can vault a technology from niche spectacle to everyday utility.
Jeff Wong is the Global Chief Innovation Officer of Ernst & Young, one of the largest professional services organizations in the world.