Chances are that the next time you watch video highlights of a fast rally or an elegant backhand from a Wimbledon match, the commentary you hear will have been generated by Artificial Intelligence (AI).
In June, the technology company IBM and the All England Club announced that generative AI technology from IBM watsonx—the company’s enterprise AI and data platform—would produce tennis commentary for all video highlight packages during the Championships this year. That means users will now hear such audio commentary for match highlight videos on the Wimbledon website and smartphone app.
To develop this AI commentary, experts from IBM iX, the experience design partner in IBM Consulting, worked with the All England Club to leverage foundation models from watsonx to train AI in the unique language of tennis, according to a statement from IBM. “Generative AI built on these foundation models was applied to produce narration with varied sentence structure and vocabulary to make the clips informative and engaging.... Its introduction this year is a step towards making commentary available in an exciting way for matches outside of Wimbledon’s Show Courts, which already have live human commentary,” the statement adds.
An IBM Research explainer says foundation models are trained on a broad set of unlabelled data that can be used for different tasks, with minimal fine-tuning.
Another new feature being introduced this year is the IBM AI Draw Analysis: the first statistic of its kind in tennis that will use AI to define how favourable the path to the final might be for each player in the singles draw. Usama Al-Qassab, the All England Club’s marketing and commercial director, said in a press statement that these new features will help fans gain even more insight into the singles draw and access commentary on a wider variety of matches through the match highlight videos.
Wimbledon has long been a front-runner among sporting competitions in adopting technology. The competition has partnered with IBM since 1990 on various technological innovations. In 1991, for instance, radar technology was introduced to measure ball speeds. In 2006, the Grand Slam tournament adopted Hawk-Eye, the electronic camera line-calling system that is also popular in cricket as part of the Decision Review System, or DRS.
But while re-evaluating a call or turning around a decision with the help of technology is one thing, describing a passage of play on the tennis court is a completely different ball game.
“I used to enjoy the AI-generated commentary in games like Fifa. These are voices of famous commentators. I also used to wonder if one day someone would record my voice and fine-tune it. Maybe it’s just the start. But I don’t know how much expression we can hear in AI commentary,” says R.R. Kaushik Varun, a Kolkata-based commentator who has worked on various football competitions in India and done Bangla commentary for select Premier League matches.
Varun says AI-generated commentary could be crisp, with minimal lines, no grammatical errors and accurate statistics, but would find it difficult to match the emotion and connection of real-life commentary. “I don’t think so—ever. We see so many videos of famous goal shouts on YouTube. The lines of commentary make it more special. Be it (M. S.) Dhoni hitting a six and Ravi Shastri shouting it out, or Peter Drury describing a game of football. These things make commentary epic,” adds Varun. “I don’t know how much of that AI can do. The expressions and feelings will be missing.... It will only talk based on what it’s fed.”
Tennis isn’t the only sport taking the AI commentary route. In March, IBM also introduced AI-generated spoken commentary for The Masters Tournament, producing detailed golf narration for more than 20,000 video clips on the Masters app and website over the course of the tournament. The company also introduced hole-by-hole predictions to project a player’s score on each hole across the tournament, which was played in April.
In March, the sports betting company Oddset, with the help of the Stockholm-based communications agency Perfect Fools, used OpenAI’s GPT-3 language model to create an immersive audio commentary campaign for football fans called “Dreams of Europe”. As part of this campaign, which involved 32 unique tracks, fans could listen, while in bed, to AI-generated radio commentary on their favourite Swedish football team as it competed in the Swedish Cup. GPT-3 was trained on the player data of every team. Overall, the AI model generated simulated commentaries for 192 matches, amounting to 2.9 million words and more than 17,000 minutes of audio content.
But will AI-generated commentary ever feel real and natural? A lot could depend on how affective AI—or emotion AI, a subset of AI—and human-machine interaction develop. Many industries—advertising, automotive, assistive services, mental health apps—are already deploying emotion AI, which measures, understands and responds to human emotions.
It remains to be seen if we will ever arrive at a point where cameras can capture a roaring Novak Djokovic on Centre Court and feed that imagery to an AI system that converts it into text and audio and presents it to the audience in real time, matching the excitement and emotion of a human commentator.
Some smaller changes are under way. As an AFP report earlier this week noted, the men’s ATP Tour announced in April the tour-wide adoption, from 2025, of electronic line-calling, a role traditionally performed by on-court line judges, in a move to “optimise accuracy and consistency across tournaments”.
Bill Jinks, technology director at the All England Club, told AFP line judges would be part of Wimbledon this year. “In 2023 we’ve definitely got line judges,” Jinks was quoted as saying in the report. “Line-calling technology has changed. We’ve been using the challenge system (players are able to query a limited number of calls, using video technology) since 2007 and it currently works for us.... Who’s to say what might happen in the future?”