A robot is beating human pros at table tennis. Its maker calls it a milestone for machines

A paddle-wielding robot is so adept at playing table tennis that it poses a daunting challenge to elite human players and sometimes beats them, according to a new study that shows how advances in artificial intelligence are making robots more agile.
Japanese electronics giant Sony built the robotic arm, which it calls Ace, and pitted it against professional athletes. Ace proved a worthy adversary, though it possessed some non-human attributes: nine camera eyes positioned around the table and an uncanny ability to track the ball’s logo to measure its spin.
The robot learned to play the sport using the AI method known as reinforcement learning.
“There is no way to manually program a robot to play table tennis. You have to learn to play through experience,” said Peter Dürr, an artificial intelligence researcher at Sony who co-authored the study published Wednesday in the scientific journal Nature.
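Reinforcement learning, in its simplest form, trains an agent by trial and error: it tries actions, observes rewards, and gradually favors the actions that paid off. The study does not publish Sony’s training code, so purely as a generic illustration of the idea, here is a minimal tabular Q-learning loop on an invented toy decision (choosing a stroke against incoming spin); the environment, names and parameters are all hypothetical:

```python
import random

random.seed(0)

# Toy environment: the agent sees incoming spin (0 = topspin, 1 = backspin)
# and picks a stroke (0 = drive, 1 = push). The "right" stroke earns reward 1.
def reward(spin, stroke):
    return 1.0 if stroke == spin else 0.0

ALPHA, EPSILON = 0.1, 0.2          # learning rate, exploration rate
q = {(s, a): 0.0 for s in (0, 1) for a in (0, 1)}  # Q-table of state-action values

for _ in range(5000):
    spin = random.choice((0, 1))                       # incoming ball
    if random.random() < EPSILON:                      # explore sometimes
        stroke = random.choice((0, 1))
    else:                                              # otherwise exploit best known stroke
        stroke = max((0, 1), key=lambda a: q[(spin, a)])
    # One-step Q-learning update: nudge the estimate toward the observed reward
    q[(spin, stroke)] += ALPHA * (reward(spin, stroke) - q[(spin, stroke)])

# After training, the learned policy maps each spin to a stroke
policy = {s: max((0, 1), key=lambda a: q[(s, a)]) for s in (0, 1)}
print(policy)
```

No one hand-codes the stroke rules here; they emerge from the reward signal alone, which is the point Dürr makes about learning through experience. Sony’s actual system learns continuous joint motions rather than a two-entry table, but the underlying trial-and-reward loop is the same family of method.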
To conduct the experiments, Sony built an Olympic-style table tennis arena at its Tokyo headquarters to give professional and other highly skilled athletes a “level playing field” against the robot, Dürr said in an interview with The Associated Press. Some athletes said they were surprised by Ace’s prowess.
Sony says this is “the first time a robot has been able to perform at a human, expert level in a competitive sport commonly played in the physical world – a long-standing milestone for AI and robotics research.”
The custom-built robot has eight joints that direct its movements, or degrees of freedom, allowing it to position the racket, execute shots and respond quickly to its opponent’s exchanges.
“Speed is really one of the fundamental problems in robotics today, especially in scenarios or environments that are not fixed,” Michael Spranger, president of Sony AI, said in an interview.
“We see a lot of robots in factories that are very, very fast,” Spranger said. “But they always follow the same trajectory. With this technology, we show that it is actually possible to train robots to be highly adaptive, competitive and fast in uncertain and constantly changing environments.”
Spranger said such technology could play a role in manufacturing and other industries. It’s also not difficult to imagine how such fast, perceptive hardware could be used in war.
A humanoid robot completed a half-marathon for robots in Beijing on Sunday, but getting a machine to interact and compete at split-second speeds with trained human athletes is, in some ways, a more difficult challenge.
Spranger said it was important that researchers not give the robot an unfair advantage: its speed, arm reach and performance were kept comparable to those of an athlete who trains at least 20 hours a week. Matches were played under the official rules of table tennis on a standard-size table.
“It’s very easy to build a superhuman table tennis robot,” Spranger said. “You’re building a machine that sucks up the ball and shoots it much faster than a human can return it. But that’s not the goal here. The goal is to have some level of comparability, some level of fairness to the human, and to really win at the AI level, at the decision-making level, at the tactic level, and to some extent, at the skill level.”
This means, he says, that “the robot can’t just win by hitting the ball faster than any human could, but it has to win by actually playing the game.”
AI researchers have long used board games like chess as benchmarks for a computer’s capabilities. They then turned to more open video game worlds. But moving AI from simulated environments to the physical world has long been the gold standard for robot makers.
Last year marked a “kind of ChatGPT moment for robotics,” Spranger said, with new AI-based approaches teaching robots about their real-world environments and enabling physically demanding feats, like backflips.
Sony is not the first to tackle robots in table tennis. John Billingsley helped pioneer such contests in 1983 in an article titled “Robot Ping-Pong.” More recently, Google’s AI research division, DeepMind, has also taken on the sport.
While impressed, Billingsley said Sony’s computer vision and motion-sensing capabilities leave a two-eyed human little chance.
“I wouldn’t want to downplay this feat, but they did it by hand and used hammer techniques,” Billingsley, a retired professor of mechatronics at the University of Southern Queensland in Australia, said in an email to the AP.
He added, however, that it adds to the lesson that “real progress comes from competitions, whether it’s hitting a ball or setting foot on Mars.”
Japanese professional table tennis players Minami Ando and Kakeru Sone were among those who took on Sony’s robot. Two referees from the Japan Table Tennis Association judged the matches.
After submitting the paper for peer review ahead of publication in Nature, the Sony researchers continued experimenting and reported that Ace increased its shot speeds, rallied faster, and played even more aggressively, closer to the edge of the table. Competing against four highly skilled players in December, Sony said, Ace beat all but one of them.
Another expert player, Kinjiro Nakamura, who competed in the 1992 Barcelona Olympics, told researchers after watching Ace play a shot that “no one else would have been able to do that. I didn’t think it was possible.”
But the fact that the robot has now done it “means there is a possibility that a human could do it too,” he said in remarks published in the journal Nature.
___
AP journalists Yuri Kageyama and Javier Arciga contributed to this report.
