SPEAKERS

LISTEN, LEARN, GET INSPIRED.

Day 1

Tuesday, January 27th

8:00 AM - 9:00 AM

STAGE 1

Check-in

Networking/Sponsor Table Visits

9:00 AM - 9:10 AM

STAGE 1

Production Summit Welcome Day 1

9:10 AM - 9:30 AM

STAGE 1

WePlay Demo

Hosted by WePlay, this session features a short video presentation followed by a brief discussion exploring how real-world spaces, culture, and play are being reimagined through digital experiences. The video sets the stage with examples of how physical environments and objects can be transformed into interactive, shareable worlds, extending their life, reach, and impact far beyond their original context.

9:30 AM - 10:10 AM

STAGE 1

Keynote - Know Your Foe

The Creative and Operational Realities of AI-Driven Production

This panel brings together Phil Tippett, members of Tippett Studios, and LOFTAPPS to discuss the evolving role of artificial intelligence in contemporary creative pipelines. Panelists will share perspectives on how emerging AI tools can be integrated into production workflows, the opportunities and challenges they present, and what this shift means for artists and studios navigating new technologies. The conversation offers an inside look at how creative teams are thinking about AI today and how it may shape the future of content creation.

10:10 AM - 10:50 AM

STAGE 1

Global Objects Video / Physical AI Panel

10:10 AM - 10:50 AM

STAGE 2

Applied AI in Visual Effects

This session explores how AI across text, image, video, and 3D models can dramatically accelerate creative and production workflows. It introduces an agentic, multimodal ecosystem where different AI agents assist with ideation, prompt enhancement, story creation, script formatting, character development, and shot planning. The presentation demonstrates how AI can merge concepts, generate cinematic imagery, build storyboards, create consistent characters and environments, and recolor or restyle visuals with precision. It highlights fine-tuning techniques, custom model training, and specialized pipelines for lighting reconstruction, video outpainting, and animation transfer. Advanced video tools show how AI can control camera paths, generate character motion, maintain continuity, and transform footage into new worlds. The session also showcases image-to-3D workflows and emphasizes AI as a powerful creative assistant that increases efficiency while keeping humans in control of storytelling and direction.

10:50 AM - 11:30 AM

STAGE 1

Global Takes

Redefining Production Across LA, London, and Beyond

As the boundaries between physical locations and creative collaboration dissolve, this panel explores how media and entertainment professionals are navigating the new era of hybrid production. Whether you're based in LA but leading projects in London, directing virtually from Lisbon, or producing content for a global audience while grounded in Hollywood, the rules have changed—and so have the opportunities.

Join us for a conversation co-curated with London & Partners, spotlighting the cultural synergies and business models emerging from the creative crossroads of London, Los Angeles, and other international production hubs. We'll dive into the realities of borderless collaboration, the rise of remote studios, and how creatives are building globally resonant content with local impact.

11:30 AM - 12:10 PM

STAGE 1

Pioneering the Future of Virtual Production

As virtual entertainment continues to grow worldwide, WePlay Studios is pushing the boundaries of what virtual production can be. Through projects like the VTuber Awards and the Genius Invokation TCG Series, WePlay has set a new standard by combining live motion capture, virtual sets, and next-gen broadcast workflows into truly immersive experiences. With hundreds of millions of viewers following VTubing globally - and the Western market still in an early growth phase - WePlay is helping shape how this space evolves.

Beyond large-scale shows, WePlay is also focused on making high-end virtual production more accessible. By adapting advanced tools into lighter, remote-friendly pipelines, broadcast talent can now cover virtual events from their own personal setups without sacrificing quality.

This approach came to life through collaborations with HoYoverse and Mythic Agency, where WePlay helped elevate both the Genius Invokation TCG series and the VTuber Awards by blending AR, VR, and mixed reality directly into the shows’ design and mechanics - creating standout productions that continue to influence how virtual events and esports broadcasts are built worldwide.

11:30 AM - 12:10 PM

STAGE 2

Building What Lasts

Creativity, Technology, and the Power of Community

Sustainability isn’t just about materials; it’s about people, systems, and the communities that carry ideas forward. This talk explores how creative and tech leaders can build work that lasts through intention, responsibility, and collaboration. Drawing from The Creative + Tech Orbit and Secret Level’s community, we’ll look at how storytelling, technology, and partnerships can create sustainable creative ecosystems.

12:10 PM - 1:10 PM

STAGE 1

NETWORKING LUNCH / WEPLAY TOURS

1:10 PM - 1:50 PM

STAGE 1

Volumetric Reality in Real-Time

Mastering Gaussian Splatting for LED Volumes with GO Stage and Pixera

Join Global Objects CTO Erick Geisler and Abbi DeLeve from Pixera (ETC) for an in-depth exploration of GO Stage, the revolutionary Gaussian Splatting (GS) playback plug-in designed to bring photoreal volumetric data directly into the heart of the live production pipeline. This session demonstrates how GO Stage eliminates the need for cumbersome meshing by allowing immersive-media creators and motion graphics artists to import, layer, and cue high-fidelity splat data natively within Pixera and Adobe After Effects. Attendees will discover how to harness GPU-accelerated shaders for real-time rendering on LED volumes and projection domes, seamlessly syncing complex 3D environments with camera tracking and timeline cues. By merging Global Objects’ "ground truth" capture with Pixera’s industry-leading media server logic, this panel reveals the new standard for deploying interactive, cinematic environments in real-world stages and live event workflows.

1:10 PM - 1:50 PM

STAGE 2

Scaling Film Productions with AI Workflows

Go behind the scenes of the groundbreaking AI short film, The Bends, developed under the Entertainment Technology Center (ETC) to explore how AI is reshaping the future of filmmaking. The Bends, an artist-driven fable, pushed the boundaries of traditional workflows using AI-integrated pipelines. In this panel, filmmakers and tech leads reveal how they reimagined editorial, VFX, and animation processes to deliver ambitious visuals without ballooning budgets or teams. From agile pipelines to ethically trained models, this session offers an inside look at what’s possible when artificial intelligence becomes a central creative force in professional production.

1:50 PM - 2:30 PM

STAGE 1

From Studio to Stream

Producing for YouTube at Scale

Jess Loren—Founder and CEO of Global Objects—sits down with Anthony Baroud, one of YouTube’s most influential creators with over 25 million subscribers, for a candid conversation about what it really takes to produce for YouTube at scale. Together, they unpack how modern production has evolved beyond traditional studios, blending cinematic craft, creator authenticity, and data-driven decision-making. Anthony shares hard-earned insights on building sustainable formats, managing production teams, and designing content that performs across a global audience—without losing creative voice. Jess brings a producer’s perspective, exploring how emerging tools like virtual production, digital environments, and scalable workflows are reshaping what’s possible for creators and brands alike. This conversation is a must-watch for producers, creators, and media executives looking to understand how YouTube has become one of the most powerful production platforms in the world and how to build content that thrives there.

1:50 PM - 2:30 PM

STAGE 2

Virtual Production Car Process

Where Gen-AI, Game Engines, and Traditional Video Fall Short

As generative AI, real-time game engines, and traditional video plates become more common on LED stages, many productions assume car process is a solved problem. In practice, it often breaks down under the pressures that matter most to producers: tight schedules, repeatability across episodes, continuity between shots, and confidence that what works on day one will still work weeks later.

This session examines where these approaches fall short when applied to car interiors, particularly around long-take continuity, lighting consistency and perspective control. Drawing from real production experience, the speakers outline a hybrid, production-first methodology that prioritizes predictability, flexibility, and creative control over novelty.

Rather than focusing on tools, the talk centers on outcomes: what reduces reshoots, minimizes on-set uncertainty, and scales reliably for episodic television and feature work. Attendees will leave with a clearer understanding of how to approach virtual production car process in a way that protects schedule, budget, and creative intent.

The approach discussed reflects why Sim-Plates was created in the first place: to deliver production-ready car process environments that hold up under real schedules, real cameras, and real creative demands.

2:30 PM - 3:10 PM

STAGE 1

After the Collapse

Clarity, Power, and Survival in the New Hollywood

As we move into 2026, the demand for "ground truth" precision in virtual production has never been higher. Stype continues to lead the industry by providing the essential link between physical movement and digital reality. In this forward-looking session, the Stype team offers an exclusive look at their latest innovations and the technological breakthroughs set to define the next era of camera tracking and immersive broadcasting. Join us to discover how Stype is pushing the boundaries of spatial data and synchronization, ensuring that every frame is captured with the absolute fidelity and millimeter-level accuracy that modern creators demand.

2:30 PM - 3:10 PM

STAGE 2

AI and the Future of Creative Development

Charles Migos, founder of Intangible AI and former executive design leader at Unity, and Philip Metschan, lead product designer of Intangible and long-time veteran of Pixar, will explore how AI is fundamentally reshaping creative development: not by replacing creatives, but by giving them new ways to think, visualize, align, and iterate together.

They will present how Intangible helps creative teams move from abstract ideas to concrete worlds earlier, faster, and with greater clarity—reducing misalignment across directors, producers, writers, designers, and clients while preserving craft.

3:10 PM - 3:50 PM

STAGE 1

AI, The End of Hollywood
and What Comes Next

Dramatic changes in audience behavior, platforms, and viewing habits, combined with runaway production, created an existential crisis in Hollywood even before the arrival of Cinematic AI. While AI is both an accelerant and a solution, the long-term reality is that the algorithm will transform media as we know it. We may hate it, but the audience will love it.

3:10 PM - 3:50 PM

STAGE 2

Scene Machine

A Sneak Peek at Lightcraft's Spark, a Collaboration Platform for Visual Storytelling

Spark is Lightcraft’s new browser-based platform for collaborative visual storytelling. Think of it as Google Docs for filmmaking. Instead of scripts, storyboards, shot lists, and location references living in separate silos, everything stays connected and explorable in one shared 3D space. Teams can build scenes using scanned real-world locations or AI-generated environments, place characters and cameras, design lighting, and capture shots, all collaboratively in real time from any laptop/desktop.

Spark is designed to help ignite stories early in development and take real pressure off crews by giving teams better ways to plan, align, and execute before time and money are committed on set.

3:50 PM - 4:00 PM

STAGE 1

BREAK

4:00 PM - 4:40 PM

STAGE 1

Real-Time Ray Tracing and Open Standards

The Future of Virtual Production with Chaos Arena

Join us for a live demonstration of Chaos Arena, a new real-time ray-tracing platform that’s redefining how filmmakers, artists, and studios approach virtual production. The latest version of Arena embraces open standards like USD and MaterialX, enabling greater compatibility and flexibility across the entire production pipeline, all while keeping the creative process fast and intuitive. By harnessing full ray tracing, Arena delivers uncompromised cinematic quality while actually reducing cost and preparation time. Artists can work directly with their native assets, lighting, and materials, creating a seamless bridge from concept to final pixel. In this session, the Chaos Innovation Lab team will demonstrate how Arena enables faster iteration, simpler workflows, and higher fidelity than traditional real-time engines, and for the first time, we’ll be unveiling brand-new features designed to push virtual production even further. Whether you’re a cinematographer, VFX artist, or virtual production supervisor, this is a look at what’s next.

4:00 PM - 4:40 PM

STAGE 2

From Scout to Screen

How AI and Real-Time Visualization Are Transforming Production Workflows
Presented by the Television Academy's Special Visual Effects Peer Group

The rapid emergence of virtual production and AI-powered creative tools is redefining how stories move from concept to camera. Nearly every visualization and VFX platform now incorporates some level of automation. There’s still real concern about how these tools might threaten creativity and IP. Fuzzy Door Tech will demonstrate how AI, AR and virtual production technologies are reshaping the future of filmmaking by accelerating workflows and elevating creative clarity.

This session features ViewScreen®, our patented ProVis™ system that merges real-time visualization with AR. Attendees will watch as digital characters, environments, and VFX elements appear directly in the camera feed on an LED volume, giving filmmakers the power to see the shot before it’s ever made. This approach collapses the gap between pre-production and post, saving time, reducing uncertainty, and enabling more confident, collaborative choices on set. Our cloud-based system also lets remote teams collaborate in real time.

By uniting virtual production, AI-assisted tools and real-time visualization, this workflow empowers productions from large scale blockbusters to independent filmmakers and storytellers.

4:40 PM - 5:20 PM

STAGE 1

Virtual Production Now

A Roundtable on Tools, Stages, and the Future of Real-Time Filmmaking

This Virtual Production Roundup brings together industry leaders shaping the next era of real-time filmmaking and immersive content. Featuring AJ Wedding (CEO, Orbital), Nathan Bazley (Global Director of Business & Operations, NantStudios), Aleksii Gutiantov (WePlay), and Justin Diener (VP, Synapse), the panel explores how virtual production is evolving across stages, platforms, and pipelines. From LED volumes and real-time engines to scalable workflows and new business models, this conversation offers practical insights into what’s working now—and where virtual production is headed next.

4:40 PM - 5:20 PM

STAGE 2

Beyond the Screen

Engineering the World’s Best Visual Experiences with Christie

Christie Digital’s Joe Conover and Chris Barnett discuss CBS’s virtual projection deployment on the Paramount Lot 3 stage, a success in the making: a 100-foot-wide by 24-foot-tall Carbon Black screen illuminated by Christie Sapphire projectors and precision-blended with Mystique. Driven by Silverdraft supercomputers and LiveFX, the result is a reliable, high-performance canvas that supports demanding production workflows while remaining straightforward for crews to operate and iterate on quickly. With Christie hardware and commissioning costs at approximately $750 per square foot of wall, the platform compares favorably to a high-performance LED volume. Beyond cost, Christie virtual projection offers materially greater flexibility in configuration, faster reconfiguration between shows, and a more sustainable path for long-term utilization across varied content needs. It also eliminates persistent LED concerns: no moiré, no pixel-pitch limitations, and no camera-to-wall constraints driving creative compromises. Color is consistent across the surface and fully adjustable, with HDR and P3-capable looks achievable for premium imaging. Most importantly, it expands creative freedom: the simulation of sunlight is materially closer to natural illumination than RGB LED, enabling more convincing highlights, falloff, and photoreal integration.

5:50 PM - 8:00 PM

STAGE 1

MIXER, NETWORKING

THINK BIG. REALLY BIG
