Presented by Supermicro and AMD
Media and entertainment has been irrevocably changed by virtual production technology, enabled by advanced chips and CPUs. In this VB Spotlight, you’ll learn how it empowers art and creativity, how these advances are driving innovation, and take a dive into recent examples.
The last three to five years have seen a massive transformation in how movies can be made with virtual production. Advanced chips and servers from companies like AMD and Supermicro have opened up the possibilities for visual effects and virtual production.
Today virtualization not only removes the limitations of physical sets and locations, but also aids collaboration, optimizes production workflows and improves resource utilization. Advances in rendering and storage enable more complex visual effects, speed up production timelines and time-to-market, and give creatives the tools they need to tell the stories they imagine.
Companies like Supermicro, drawing on powerful processor innovation from AMD, are developing next-generation rackmount servers that reduce render times, advanced workstations that enable better collaboration, and storage solutions that move data faster and more efficiently. Here’s a look at how this technology is not only transforming popular franchises but has changed how the entertainment industry works.
An evolutionary leap in virtual production
The pandemic couldn’t shut down the media and entertainment (M&E) industry entirely. Remote production became key, allowing hundreds of collaborators across the country (and the globe) to come together to keep making movies and television.
Seamless collaboration calls for powerful virtual machines or servers that can be provisioned and scaled up or down as required, which also brings greater efficiency, optimized production workflows and better resource utilization. Studios are realizing not only cost savings; creative teams are also able to source the best talent anywhere.
Virtual production has also opened up tremendous creative opportunities for filmmakers. Because virtualization removes the limitations of physical sets and locations, it opens up diverse and imaginative worlds with more complex geometry and larger scenes, since bigger media files are now possible while still loading quickly on the production side. Higher core counts in a denser space, combined with higher clock speeds, deliver significantly more processing power than ever before, making high-end workloads for in-camera visual effects accessible from anywhere, at any time.
With real-time virtual environments, a sunset can last for 10 hours, or an actor can be transported from the Gobi Desert to the Antarctic via virtual production, on the same stage. Actors no longer have to look at a green screen, but can actually see and interact with the background around them – an evolution of the tech side of visual storytelling.
New possibilities for rendering and storage
With tight production timelines and short deadlines, keeping quality high while producing work quickly requires slashing the time it takes to generate complex visual effects.
Rendering is a big part of that. Not only do large files take time to process, they also burn a lot of energy, and all of that translates into production costs and schedule holdups. With technology like Supermicro’s high-performance and multinode servers, powered by AMD’s EPYC processors, artists get high core counts, maximum throughput and fast rendering.
It’s also a challenge to store and transfer terabytes of rendering and composition data both quickly and securely. But high-performance storage and networking devices can move data fast, eliminating bottlenecks and reducing overheating. Being able to store and move data quickly, especially when working with a virtual team, speeds up the overall production timeline, allowing for quicker iterations and ultimately faster time-to-market.
For example, Industrial Light & Magic (ILM) worked with Supermicro and AMD to develop StageCraft, a highly realistic virtual environment where LED walls replace traditional green screens, allowing actors to interact with lifelike surroundings, enhancing performance and visual authenticity, while also allowing for quick production transitions. A technology like this requires real-time rendering engines to generate visuals in sync with camera movements and lighting.
Harnessing enough power, and fast
Special effects traditionally require a truly tremendous amount of compute power. To harness it, bigger studios use render farms: collections of networked server-class computers working together to process data fast. Today ILM uses the Supermicro BigTwin, with parallel-processing power from AMD’s EPYC CPUs, which allows machines to complete more tasks simultaneously, cutting production times and slashing costs.
These advanced tools and technology mean that actors can give more nuanced performances, and artists can push the boundaries of storytelling. Deadlines will always be tight, but with faster processing times, efficient workflows, speedier rendering, and new capabilities, like footage being sent directly from the camera into pre-production, artists have more time than ever to create the work they envision.
Topics include:

- How virtualization aids collaboration, optimizes production workflows and resource utilization, and removes the limitations of physical sets and locations
- How advances in rendering and storage speeds enable complex visual effects and speed up production timelines and time-to-market
- A look at the way real-time rendering engines can stretch the boundaries of filmmaking
- And more
Speakers:

- James Knight, Global Media & Entertainment/VFX Director, AMD
- Erik Grundstrom, Director, FAE, Supermicro
- Dean Takahashi, Lead Writer, GamesBeat (moderator)