“At beach volleyball, we are now using cameras with computer vision technology to track not only the athletes but also the ball,” says Alain Zobrist, CEO of Omega Timing. “So this is the combination where we use camera technology and artificial intelligence to do this.”

Omega Timing’s R&D department includes 180 engineers, and according to Zobrist, the development process began in 2012 with positioning systems and motion sensor systems. The goal was to reach a point where the more than 500 sporting events that Omega works on each year, across multiple sports, could provide detailed live data on athlete performance. The system takes less than a tenth of a second to measure, process, and transmit that data, so the information matches what viewers see on screen.

For beach volleyball, this means taking that positioning and motion-sensing technology and training the AI to identify a number of shot types – from smashes to spikes and their variations – and pass types, as well as the flight path of the ball, then combining this data with information from gyroscope sensors fitted into the players’ clothing. This motion sensor system registers an athlete’s direction of movement, as well as the height of jumps and other motions. All of this is then fed to live broadcasters for use in commentary or in on-screen graphics.

According to Zobrist, a hard lesson for the AI to learn was tracking the ball accurately when the cameras could not see it during play. “Sometimes it is covered by a body part of an athlete. Sometimes it’s out of the TV frame,” he says. “So it was a challenge to track it when you lost it – to predict where the ball goes and then, when it reappears, close the gap between where it lost and regained the object, fill in the [missing] data, and continue automatically. That was the biggest issue.”
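Omega has not published how its tracker bridges these occlusion gaps, but a common approach is to extrapolate the last observed positions with a simple ballistic model until the ball reappears. A minimal sketch (the function name, the 2D coordinates, and the constant-velocity-plus-gravity model are all illustrative assumptions; only the 250 fps frame rate comes from the article):

```python
import numpy as np

def predict_gap(track, n_missing, dt=1 / 250):
    """Fill a gap in a ball track by extrapolating from the last two
    observed (x, y) positions, assuming constant velocity plus gravity.
    dt is the frame interval (cameras run at 250 fps per the article)."""
    p_prev, p_last = np.asarray(track[-2]), np.asarray(track[-1])
    v = (p_last - p_prev) / dt           # estimated velocity in units/s
    g = np.array([0.0, -9.81])           # gravity acts on the y axis
    filled = []
    for k in range(1, n_missing + 1):
        t = k * dt
        # ballistic extrapolation: p + v*t + 0.5*g*t^2
        filled.append(p_last + v * t + 0.5 * g * t ** 2)
    return filled
```

A production tracker would typically wrap this kind of prediction in a Kalman filter and re-associate the predicted track with fresh detections once the ball is visible again.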

Tracking the ball is crucial for the AI to determine what is happening during the game. “When you can track a ball, you know where it was located and when it changed direction. And with the addition of the sensors on the athletes, the algorithm will then recognize the shot,” says Zobrist. “Then, whether it is a block or a smash, you will know which team and which player it was. So it is this combination of the two technologies that allows us to be accurate in the measurement of data.”

Omega Timing claims that its beach volleyball system is 99 percent accurate, thanks to the sensors and multiple cameras running at 250 frames per second. Toby Breckon, a professor of computer vision and image processing at Durham University, however, is interested to see whether this holds up in live competition – and, crucially, whether the system is fooled by race and gender differences.

“What has been done is reasonably impressive. And you would need a big dataset to train the AI on all the different moves,” says Breckon. “But one thing is accuracy. How often does it go wrong on those different moves? How often does it lose track of the ball? And does it work equally well for all races and genders? Is it 99 percent accurate for the USA women’s team? Is it also 99 percent accurate for Ghana’s women’s team?”
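Breckon’s question – whether a headline accuracy figure holds for every cohort – comes down to computing the metric per group rather than once over the whole dataset. A minimal sketch (the function and record format are assumptions, not anything Omega has described):

```python
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, predicted, actual) tuples.
    Returns accuracy per group, so a headline figure like '99 percent'
    can be checked for each team or demographic cohort separately."""
    hits, totals = defaultdict(int), defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        hits[group] += int(predicted == actual)
    return {g: hits[g] / totals[g] for g in totals}
```

A single aggregate accuracy can hide a subgroup where the system performs far worse, which is exactly the failure mode Breckon is warning about.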

Zobrist is confident, and explains that while it might have been easier to call on Google or IBM to provide the necessary AI expertise, this was not an option for Omega. “The crucial thing, whether it’s for a scored game or a timed sport, is that we can’t have a discrepancy between the performance explanation and the end result,” he says. “So to protect the integrity of the result, we cannot rely on another company. We need to have the skills in-house to be able to explain the outcome and how the athletes got there.”

Speaking of future timing and tracking upgrades, Zobrist is tight-lipped, but says the 2024 Paris Games will be a major one. “You will see a whole new set of innovations. Of course, it will be around timekeeping, scoring, and motion sensor and positioning systems. And certainly also Los Angeles in 2028. We’ve got some really interesting projects out there that we’ve really just started.”
