Dock10 is the UK’s leading television facility, located at MediaCityUK in Greater Manchester. It partnered with the University of York and 2LE Media to tackle the creative and commercial limits of green-screen virtual production.
With over £1.1 million in match funding from the Innovate UK BridgeAI programme’s Collaborative R&D grant, the consortium created a suite of AI-powered technologies that make high-quality, immersive content production accessible and affordable for broadcasters and content creators.
The challenge
Virtual production is increasingly used in high-end films and TV, but current solutions are expensive and creatively restrictive. Traditional green-screen virtual production has long struggled with a fundamental problem: lighting does not naturally transfer between the real and virtual worlds. Actors on green screens do not cast shadows or pick up reflections from virtual environments, making scenes less believable and requiring expensive, time-consuming manual post-production. LED volume technology can solve some of these issues, but costs approximately ten times more than green-screen setups and is not suitable for multi-camera productions. As a result, creative and commercial opportunities for live entertainment, game shows and children’s programming have been severely limited. Until recently, AI was not advanced enough to deliver the broadcast-quality results needed for professional TV production.
Dock10 is the UK leader in virtual studio production and the University of York is the UK leader for research into digital creativity. By combining our expertise, we’ve cracked one of the great challenges for virtual studio productions. The AI-powered solutions we’ve developed have the potential to radically improve virtual production workflows and make the production of high-quality cross-reality content accessible to a much wider set of production companies and creative stakeholders.
– Richard Wormwell, Head of Innovation, Dock10
The solution
BridgeAI funding enabled a full year of intensive R&D in a purpose-built studio and provided resources to develop cutting-edge AI tools specifically for broadcast needs. This collaboration between industry and academia resulted in a suite of AI-powered ‘decomposition’ technologies that effectively ‘teleport’ human performers into virtual environments. The AI analyses green-screen video and breaks it down into multiple layers: the performer’s 3D shape, material properties (how skin, hair and clothing reflect and absorb light), and separation of direct versus ambient illumination. Using these layers, 3D graphics engines can simulate physically accurate lighting – casting proper shadows, adding realistic reflections, and making actors genuinely appear to exist within the virtual world. The technology integrates seamlessly into existing green-screen studio workflows and requires no significant additional training for studio crews, making advanced virtual production much more accessible.
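To illustrate the idea (the project’s actual models and pipeline are not public), the sketch below shows how a renderer could use two of the decomposed layers – material colour (albedo) and 3D shape (surface normals) – to relight a performer under a virtual light. It uses a simple Lambertian shading model purely as a stand-in; the names and parameters are illustrative assumptions, not the project’s API.

```python
# Hypothetical sketch: relighting a green-screen performer from decomposed
# layers. Assumes the AI has already produced per-pixel albedo and normals;
# the Lambertian model here is a stand-in for the real lighting simulation.
import numpy as np

def relight(albedo, normals, light_dir, light_color, ambient):
    """Shade a decomposed performer layer under a virtual light.

    albedo:      (H, W, 3) base colour extracted from the green-screen plate
    normals:     (H, W, 3) unit surface normals recovered from the 3D shape
    light_dir:   (3,) direction towards the virtual light source
    light_color: (3,) RGB intensity of the direct (virtual) light
    ambient:     (3,) RGB ambient term from the virtual environment
    """
    l = light_dir / np.linalg.norm(light_dir)
    # n.l clamped at zero: surfaces facing away receive no direct light
    n_dot_l = np.clip(normals @ l, 0.0, None)[..., None]
    return albedo * (ambient + n_dot_l * light_color)

# Toy 1x2 "image": one pixel facing the virtual light, one facing away
albedo = np.full((1, 2, 3), 0.8)
normals = np.array([[[0.0, 0.0, 1.0], [0.0, 0.0, -1.0]]])
out = relight(albedo, normals,
              light_dir=np.array([0.0, 0.0, 1.0]),
              light_color=np.array([1.0, 1.0, 1.0]),
              ambient=np.array([0.1, 0.1, 0.1]))
```

Because the lighting is computed from the layers rather than painted in by hand, changing the virtual light instantly changes the performer’s shading – which is what lets the pipeline replace manual per-shot VFX relighting.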
The impact
The AI-powered approach delivers significantly improved visual quality at a fraction of the cost of LED volume alternatives. Productions can now use standard green-screen infrastructure, avoiding multi-million-pound investments in LED volume stages and cutting operational costs to roughly a tenth of an LED volume shoot. The AI automates complex lighting effects through a low-latency pipeline, dramatically reducing post-production time and cost. Early demonstrator productions have validated that the technology delivers output suitable for major UK broadcasters. As part of the project, 2LE Media developed an innovative children’s drama format, pitched to those broadcasters, showcasing how the AI lighting technologies enable action-packed content that would otherwise require expensive location shoots or extensive manual post-production. The project also produced proof-of-concept demonstrations for entertainment and immersive content.
The future
The project has laid the foundation for market-ready AI technologies, with ongoing development focused on achieving full real-time capability for live broadcast production. Dock10 is expanding its immersive content services and exploring new markets, including high-end TV, advertising and streaming. The consortium aims to generate significant new UK-based revenue through direct production services and technology licensing to virtual production facilities worldwide. The technology also opens opportunities for cross-reality content production – an emerging market estimated at £70 billion globally.
Lighting is one of the core elements of creative expression in any TV show, but the lighting options for virtual studio productions have been extremely constrained. What we’ve achieved through this project is genuinely transformative: we’ve proven that AI can automate, in near real-time, complex lighting effects that previously required hours of manual VFX work.
– Dr Florian Block, R&D Lead, AI & Immersive at Dock10 and Reader in Digital Creativity, University of York