The advantages of XR virtual production are prominent: greatly improved shooting coordination efficiency, reduced post-production workload and cost, a stronger sense of presence and interactivity on set, richer and more flexible shooting techniques, and faster scene changes. As a result, XR virtual production technology is increasingly widely used in virtual studios.
1. Technical System Architecture of XR Virtual Studio
In addition to the cameras, lighting systems, and audio systems of a traditional studio, an XR virtual studio system also includes LED screens with their playback control systems, a camera tracking system, and 3D graphics virtual engine servers. The technical architecture is shown in the following figure.
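The relationship among these subsystems can be sketched as a simple data model. Every class name, field name, and value below is illustrative only, not part of any vendor's SDK:

```python
from dataclasses import dataclass


@dataclass
class LEDVolume:
    # Physical LED screen plus its playback/image-processing controller
    width_m: float
    height_m: float
    pixel_pitch_mm: float
    processor: str


@dataclass
class XRStudio:
    # Core subsystems an XR virtual studio adds on top of the
    # traditional camera, lighting, and audio systems
    cameras: list[str]
    led_volumes: list[LEDVolume]
    tracking_system: str   # e.g. optical or encoder-based camera tracking
    media_server: str      # XR media server host
    render_engine: str     # 3D graphics virtual engine


studio = XRStudio(
    cameras=["cam-1"],
    led_volumes=[LEDVolume(10.0, 5.0, 2.6, "led-processor-1")],
    tracking_system="infrared-optical",
    media_server="xr-server-1",
    render_engine="realtime-3d-engine",
)
```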
(Source: disguise website)
Built on top of the traditional studio's lighting and audio equipment, the integration and coordination of the software and hardware systems above constitute a complete XR virtual studio, capable of producing a wide range of striking XR visual effects. Motion tracking systems or motion capture sensors can also be added to this foundation to achieve AR tracking and other AR interactive effects.
In the actual construction of an XR virtual studio system, the XR media server is the core component.
2. The Basic Process of XR Virtual Production
First, the pre-designed 3D virtual scenes are imported into the XR media server, and parameters such as camera positions and the spatial positions of the LED screens are configured.
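As a rough illustration, such a setup configuration might look like the following. The file name, coordinates, and lens values are invented placeholders; real systems store this information in vendor-specific project files:

```python
# Illustrative stage configuration (all names and values are assumptions):
# world-space positions of the LED wall corners and the tracked camera,
# as they would be measured during system calibration.
stage_config = {
    "scene_file": "virtual_set_01.fbx",   # pre-designed 3D virtual scene
    "led_wall": {
        # corner positions in metres, in the studio's world coordinates
        "lower_left":  (-5.0, 0.0, 4.0),
        "lower_right": (5.0, 0.0, 4.0),
        "upper_left":  (-5.0, 3.0, 4.0),
    },
    "camera": {
        "position": (0.0, 1.7, 0.0),      # starting pose of tracked camera
        "sensor_width_mm": 23.76,
        "focal_length_mm": 35.0,
    },
}
```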
During shooting, the tracking system continuously transmits the camera's position, orientation, and lens parameters to the XR media server and the virtual engine.
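Many tracking vendors can stream this pose data over UDP using the FreeD protocol's "type D1" packet. Below is a simplified parser for the common 29-byte layout; the field scaling follows the publicly documented convention (1/32768-degree angle units, 1/64 mm position units), but should be verified against a specific device's manual before use:

```python
def _s24(b: bytes) -> int:
    """Sign-extend a big-endian 24-bit integer."""
    v = int.from_bytes(b, "big")
    return v - (1 << 24) if v & (1 << 23) else v


def parse_freed_d1(pkt: bytes) -> dict:
    """Parse a 29-byte FreeD 'type D1' camera pose packet."""
    if len(pkt) != 29 or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    if (0x40 - sum(pkt[:28])) % 256 != pkt[28]:
        raise ValueError("checksum mismatch")
    return {
        "camera_id": pkt[1],
        # angles in degrees (1/32768-degree units)
        "pan":  _s24(pkt[2:5]) / 32768,
        "tilt": _s24(pkt[5:8]) / 32768,
        "roll": _s24(pkt[8:11]) / 32768,
        # positions in millimetres (1/64 mm units)
        "x": _s24(pkt[11:14]) / 64,
        "y": _s24(pkt[14:17]) / 64,
        "z": _s24(pkt[17:20]) / 64,
        # raw lens encoder counts; mapping to mm is lens-specific
        "zoom":  int.from_bytes(pkt[20:23], "big"),
        "focus": int.from_bytes(pkt[23:26], "big"),
    }
```

A receiving loop would simply read 29-byte datagrams from a UDP socket and pass each one to `parse_freed_d1`.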
The XR server's rendering engine computes the scene imagery from the camera tracking data, projects the relevant portion of the 3D virtual scene onto the LED screen in real time, and combines it with real-world characters and objects. Hosts, actors, and other on-site participants can see themselves within the virtual space in real time and feel immersed in it.
At the same time, the camera feeds the real scene it captures within the LED screen space (including the characters) back to the XR server. The server aligns the real scene position with the virtual scene position, performs positioning correction and color correction, and composites the camera image with the virtual scene content to extend it beyond the physical screen. The final output is an extended-reality image produced by this overall compositing.
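The compositing step can be sketched in a few lines of NumPy. This is a deliberately crude stand-in for what a real XR server does with calibrated LUTs and per-pixel alignment; both function names are invented for illustration:

```python
import numpy as np


def composite_set_extension(camera_frame, virtual_frame, mask):
    """Blend the camera image (LED wall + talent) with the rendered
    virtual scene outside the screen area ('set extension').

    camera_frame, virtual_frame: float arrays in [0, 1], shape (H, W, 3).
    mask: float array in [0, 1], shape (H, W, 1); 1 where the camera
          image is kept (inside the LED screen area), 0 where the
          virtual extension replaces it.
    """
    return mask * camera_frame + (1.0 - mask) * virtual_frame


def match_gain(camera_frame, virtual_frame, patch):
    """Crude colour correction: scale the virtual frame so that a shared
    reference patch (y0, y1, x0, x1) has the same mean colour in both
    images. Real systems use calibrated LUTs; this is only a sketch."""
    y0, y1, x0, x1 = patch
    cam_mean = camera_frame[y0:y1, x0:x1].mean(axis=(0, 1))
    vir_mean = virtual_frame[y0:y1, x0:x1].mean(axis=(0, 1))
    gain = cam_mean / np.maximum(vir_mean, 1e-6)
    return np.clip(virtual_frame * gain, 0.0, 1.0)
```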
Throughout the process, the 3D virtual scene projected onto the LED screen changes synchronously with the camera's movements to compensate for parallax, ensuring that the virtual scene captured by the camera has a convincing sense of depth and space.
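Mathematically, this parallax compensation amounts to rebuilding an off-axis (asymmetric) projection every frame from the tracked camera position and the fixed LED-wall corners, in the style of Kooima's generalized perspective projection. The sketch below assumes a flat wall and OpenGL-style clip conventions:

```python
import numpy as np


def off_axis_projection(pa, pb, pc, pe, near, far):
    """Off-axis perspective projection for a flat LED wall.

    pa, pb, pc: world-space corners of the screen (lower-left,
                lower-right, upper-left); pe: tracked camera position.
    Returns a 4x4 matrix that keeps the rendered scene
    perspective-correct for the camera as it moves."""
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen distance
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d
    # Standard OpenGL-style frustum matrix for the asymmetric volume
    P = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0]])
    # Rotate world into screen-aligned space, then move the eye to the origin
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -pe
    return P @ M @ T
```

Recomputing this matrix from each frame's tracked camera position keeps points on the wall plane fixed on screen while points "behind" the wall shift as the camera moves, which is the parallax behavior described above.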
3. Selection of Core Software and Hardware System Solutions
With the rise of XR virtual production, a growing number of software and hardware platform solutions support it. Relatively mature solutions currently include Disguise, Zero Density, Pixotope, and Brainstorm.
Camera Tracking Technology Systems
As with green-screen virtual production, professional tracking device and service providers include Mo-Sys, Stype, BlackTrax, OptiTrack, and Vicon. Their systems use active or passive infrared optical capture, and their tracking data and positioning adjustments can be integrated into the XR media server. These traditional virtual production tracking systems are highly professional and can meet the needs of medium and large productions, but their equipment costs are also very high, and small studios generally cannot afford them.
As virtual production technology develops, cost-effective tracking solutions are also emerging. SEEDER's robotic crane system and pan-tilt head tracking system are innovative solutions for virtual production that greatly reduce the difficulty and cost of shooting virtual scenes and improve production efficiency.