First they came for chess, now they come for video games. In the 21st century, a legion of AI competitors could very well dominate every board, screen, and console played by humans. Famous machines like Deep Blue and AlphaGo have already conquered some of the world’s most complex strategy games.
Next up, the PS5.
Japanese tech giant Sony revealed Wednesday that it has trained the toughest-ever opponent for the race-car simulator Gran Turismo—a champion that can beat top-class esports drivers at their own game. Forged on the battlegrounds of over 1,000 PlayStation 4 consoles, the AI racer-bot has grown smart enough to identify optimal course routes and can execute skilled tactical maneuvers to pass or block competitors, even in a vehicular scrum. It does so with ruthless effectiveness—while still respecting the human etiquette of the game, Sony claims.
The company published research on its brainchild—dubbed Gran Turismo Sophy—in the journal Nature this week. The development process paired “state-of-the-art, model-free, deep reinforcement learning algorithms with mixed-scenario training to learn an integrated control policy that combines exceptional speed with impressive tactics,” it said. “In addition, we construct a reward function that enables the agent to be competitive while adhering to racing’s important, but under-specified, sportsmanship rules.”
In a media-broadcast demonstration, Sophy bested four of the world’s top Gran Turismo drivers in head-to-head contests, proving the tech’s superiority to mere mortals. But Sophy’s aspiration was never to crush humanity’s spirits or leave players feeling defeated. On the contrary, it was meant to spark fresh excitement in esports, especially among elite players who felt they had no challenge left to answer.
“I feel frustrated, that never happened before battling with an AI,” Tomoaki Yamanaka, one of the four racers, said after the loss. “I drove like I would drive against a human. That’s a really amazing thing.”
In that sense, Sophy pushes the human limit; it can “accelerate and elevate the players’ techniques and creativity to the next level,” said Hiroaki Kitano, CEO of Sony AI, in a statement. The company has said it’s exploring ways to integrate Sophy into future versions of Gran Turismo (the game’s seventh edition is set to launch in March).
Gran Turismo now joins a long list of games in which AIs have beaten people, including shogi, Go, StarCraft, classic Atari video games, and the multiplayer series Defense of the Ancients, for which the Microsoft-backed OpenAI created a fighter bot.
But Gran Turismo is more complex than other console games, requiring players to balance the physics of friction and aerodynamics, all while making split-second judgment calls and reacting to shifting landscapes with lightning-fast reflexes. And even beyond that, experts say Sophy’s achievement stands out for its capacity to behave aggressively yet still fairly, and to observe the gamers’ code of conduct beyond just the letter of the law—in other words, to embody the subtle nuances of human character.
Sony didn’t want Sophy to win by bullying other racers off the road, even when such moves were technically legal. To make sure it wouldn’t, the team trained its neural network by levying penalties for collisions with other drivers, for example—using a trial-and-error process referred to as reinforcement learning.
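In reinforcement learning, this kind of discouragement is typically done through the reward signal: the agent earns reward for making progress but loses some of it when it does something undesirable, like colliding with a rival. The snippet below is a minimal illustrative sketch of that idea, not Sony’s actual (unpublished) reward function; every name, penalty value, and input is a hypothetical stand-in.

```python
# Hypothetical sketch of reward shaping for a racing agent.
# The agent is rewarded for progress along the track, but penalized
# for collisions and corner-cutting, so aggressive "bullying" tactics
# become unprofitable during training. All values are illustrative.

def step_reward(progress_m: float, collided: bool, at_fault: bool,
                off_course: bool) -> float:
    """Combine course progress with sportsmanship penalties."""
    reward = progress_m        # meters gained along the track this step
    if collided:
        reward -= 5.0          # flat penalty for any car-to-car contact
        if at_fault:
            reward -= 15.0     # extra penalty when the agent caused it
    if off_course:
        reward -= 2.0          # discourage cutting across the grass
    return reward
```

Trained against a signal like this over millions of simulated laps, an agent learns that clean overtakes pay better in the long run than shoving a rival off the racing line.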
“The agent should be a friend, a comrade, a buddy to human beings, an agent that people can feel sympathy with,” said Kazunori Yamauchi, the creator of Gran Turismo. “Also, the agent can stimulate the emotion of people, so that the agent and human beings can mutually respect each other.”