OBS produces its video content in ultra-high definition and high dynamic range, which should improve the level of detail and color in every shot. Content is also captured in all sorts of formats: vertical video for viewing clips on phones, 8K video for full-definition broadcasts, and 360-degree shots for truly immersive drama.
OBS says it has more than doubled its use of multi-camera systems, and Exarchos says it is also using cinematic camera lenses, which are capable of capturing more artistic shots, such as the enhanced depth-of-field shifts you’ve likely seen in movies. The difficulty in achieving that with traditional sports cameras is that the time it typically takes to process such complex shots has kept them out of live production. But OBS is relying on AI and cloud technologies to speed up processing enough to use these shots in its live coverage. Exarchos says the new processes enable shots that were previously impossible to capture and present live, such as 360-degree replays that spin the viewer around an athlete as they fly through the air.
“The effects that exist in The Matrix, the movies you can make in theaters, you can make live,” says Exarchos.
OBS is also recording sound in 5.1.4 format, with the aim of capturing immersive audio from stadiums during events and during interviews with athletes in the stands. That, along with elements like augmented reality stations that give people a glimpse of what the Olympic stadium looks like, is intended to make those at home feel closer to the games.
“If we repeat the previous games, which were very successful, we will have failed,” says Exarchos. “Because, as in sports, it is all about breaking new ground, opening new frontiers and taking another step.”
Technology testing ground
As you would expect in 2024, artificial intelligence tools will be widely used during the Olympic Games.
Broadcasters like Olympic Broadcasting Services and NBC will use AI to comb through thousands of hours of footage, find the key moments, package them elegantly, and deliver them directly to the viewer. Some are betting bigger on AI than others; NBC will use the voice of legendary sportscaster Al Michaels to narrate highlights on Peacock. The team trained its generative AI voice engine on Michaels’ previous broadcast television appearances, and the results sound fluid yet unmistakably uncanny.
As you watch live events, AI will be able to pull up key information in real time and display it on screen: statistics about athletes, the probability that they’ll make a shot or beat the clock, and artificially augmented views of what’s happening on the field. AI’s reach extends beyond the games themselves; NBC is incorporating AI into its advertising platform, with the goal of better personalizing the ads that play during breaks.
This extravagant broadcasting bacchanal also serves as a testing ground for these new technologies. NBC is using the Olympics as the first major test of its Multiview capability and user-customization features, so expect them to appear more often in regular live sports broadcasts. According to an NBC representative, the company hopes the technology debuting at the Paris Olympics can be brought to other live sporting events, and even to non-sporting spectacles like the Macy’s Thanksgiving Day Parade.
Ultimately, Exarchos says, the goal of these technologies is to make people feel more connected to these events and the people who participate in them, especially after the last two Olympics were bogged down by pandemic restrictions that limited who could attend.
“We are going through a period where people have a huge desire and nostalgia to relive physical experiences, especially with other people,” says Exarchos. “Sport is a great catalyst for that.”