How Owl AI is improving sports officiating with Gemini to modernize the sports viewing experience

Google Cloud · Beginner · 🧠 Large Language Models · 3h ago
Google for Startups Cloud Program: https://goo.gle/4mRfHZk
Contact Our Cloud Startup Team: https://goo.gle/3QNwa4O

Featured in this video: Josh Gwyther, CEO, Owl AI

Executive summary: Owl AI, an AI sports officiating and analysis solution, is out to improve the viewing experience for professional sports. The company uses Gemini to train AI models on complex, nuanced video so it can analyze live events. By running multiple instances of Gemini over the same footage, Owl AI aggregates their outputs to make more accurate calls. In its first two weeks as a company, Owl AI wrote more than 30,000 lines of code using Gemini Code Assist and enabled the Summer X Games to broadcast in four new languages. This resulted in a 400% increase in international viewership, proving Owl AI can deliver critical calls in seconds and improve the sports viewing experience.

Challenge: Referee call accuracy in professional sports is only becoming more important. When officials review critical plays, complex calls can take valuable minutes and are susceptible to bias, which slows down the competition and disrupts the viewing experience for fans. Owl AI needed a way to efficiently train AI models on video to speed up and enhance refereeing, judging, and analytics for live sports. The company needed a solution that could learn the nuances of a sport and accurately understand complex movements and fluid dynamics in video.

Solution: In its search for the right AI solution, Owl AI chose Gemini. Its multimodal capabilities mean Gemini can understand nuances in movement, which is essential for analyzing live events. Instead of trying to get one model to do everything, Owl AI uses multiple instances of Gemini to review the same footage, then aggregates the information to make accurate calls.

Result: With Gemini, Owl AI was able to leap into production immediately. In just two weeks, the company wrote more than 30,000 lines of code using Gemini Code Assist.
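The fan-out pattern described in the Solution section (several Gemini calls reviewing the same clip, with their verdicts aggregated afterward) can be sketched with the Gemini API. The code below is a hypothetical illustration, not Owl AI's actual pipeline: the model name, the prompt, and the majority-vote aggregation rule are all assumptions.

```python
# Hypothetical sketch of the fan-out-and-aggregate pattern: several independent
# Gemini calls review the same clip, and a simple majority vote decides the call.
# Model name, prompt, and voting rule are assumptions, not Owl AI's pipeline.
import time
from collections import Counter

import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # in practice, read the key from the environment

PROMPT = (
    "You are a judge for this event. Watch the clip and answer with exactly "
    "one word, LANDED or NOT_LANDED, for the athlete's final trick."
)

def judge_clip(video_path: str, runs: int = 5) -> str:
    """Fan the same clip out to several Gemini calls and return the majority verdict."""
    clip = genai.upload_file(video_path)
    while clip.state.name == "PROCESSING":      # video uploads finish processing asynchronously
        time.sleep(2)
        clip = genai.get_file(clip.name)

    model = genai.GenerativeModel("gemini-1.5-pro")  # assumed model choice
    verdicts = []
    for _ in range(runs):
        response = model.generate_content([clip, PROMPT])
        verdicts.append(response.text.strip().upper())

    verdict, votes = Counter(verdicts).most_common(1)[0]
    print(f"{votes}/{runs} independent reviews agree: {verdict}")
    return verdict

# Usage with a hypothetical local file:
# judge_clip("final_run.mp4")
```

Running several independent reviews instead of one trades latency and cost for consistency; disagreement among the reviews can also flag clips that should go to a human judge.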

Related AI Lessons

How Encoder Transformers Actually Understand Language
Learn how encoder transformers understand language through the evolution of the attention mechanism in encoder-only models like BERT and ModernBERT.
Medium · AI
Pony.ai Unveils NVIDIA-Powered Domain Controller for L4 Autonomy
Pony.ai and NVIDIA collaborate on a domain controller for L4 autonomy, enhancing large-scale autonomous driving deployment
Dev.to · AI
Your LLM budget alerts won't save you if you can't map costs to users
Learn to map LLM costs to users to avoid unexpected expenses, despite having budget alerts
Dev.to · John Medina
Up next
5 Levels of AI Agents - From Simple LLM Calls to Multi-Agent Systems
Dave Ebbelaar (LLM Eng)