Kerris unveiled the company’s vision for real-time AI as the future of media creation, saying it will fundamentally reshape how stories are told across film, TV, gaming, and live broadcasts. He showcased tools designed to empower developers and accelerate production workflows with intelligent, scalable, and personalised content delivery.
Artificial intelligence (AI) is no longer just a backroom tool for content production—it’s now central to how stories are being created, told, and experienced in real time. Speaking at the inaugural World Audio Visual and Entertainment Summit (WAVES) 2025 in Mumbai, Richard G. Kerris, VP at Nvidia, laid out how AI is changing the entire media and entertainment value chain, with a growing ecosystem of developers and next-gen tools.
Nvidia’s pitch to the industry is clear: AI isn’t replacing the artist; it’s accelerating creativity. “Whether it’s a movie, a TV show, or a game—it’s really about the story. These tools help you get to that point faster,” Kerris told the audience, spotlighting the company’s developer-first strategy and initiatives such as RTX Kit, HoloScan for Media, and NIM (Nvidia Inference Microservices).
According to him, the core of the AI revolution lies with developers. “Developers are key to what’s taking place with AI, because they understand how an application works, and they can harness the power of AI and bring it to fruition with the work that’s being done together.”
And at the heart of this transformation is real-time AI—a shift from traditional post-production efficiencies to live, interactive, and intelligent content delivery. Kerris highlighted HoloScan for Media, Nvidia’s new software-defined platform that enables real-time AI integration into live broadcasts. “Where you once needed costly, dedicated hardware, you can now configure things on the fly—with real-time AI driving decisions like language changes or personalised feeds,” he explained.
Nvidia’s ecosystem now includes close to 30,000 generative AI companies, a sharp increase from the previously cited 22,000. “That’s how fast it’s growing,” Kerris said, adding that partners range from established firms like Adobe to newer players like Runway and Arctris.
Nvidia’s focus on developers also led to the rollout of NIM, microservices for tasks like translation, visual effects, or interactive overlays. These can be combined into “blueprints” tailored for different production needs. “It’s growing fast, and we’re here to provide the platform—our job is to support the developers who are shaping the next generation of storytelling,” Kerris said, adding that such tools are designed to scale from consumer setups to data centre-grade workflows with a unified software stack.
In media workflows, this means content can now be reformatted automatically—for instance, adapting 16:9 cinematic content to vertical formats for mobile viewing—without breaking character continuity. “Engagement and personalisation in these experiences is what AI is helping to transform,” said Kerris.
Although the evolution of AI has raised growing concerns about job losses, he stressed that these tools aren’t about replacement but about expanding what’s possible. “It doesn’t replace the artist out there. What it does is it accelerates the capability for an artist to tell their story,” Kerris concluded.