XR LED (XR VideoWall Rendering Engine)

Built on UNREAL ENGINE 5.4.4, this real-time 3D UHD XR VideoWall rendering engine provides four DP outputs. It features an open control protocol that supports automatic network-based loading of content and templates, as well as on-demand playback control, enabling automated production workflows for broadcast studios. Video output achieves AAA-grade quality, integrating Lumen global illumination and reflections, Nanite virtualized micropolygon geometry, and Virtual Shadow Maps. Blueprint visual scripting eliminates the need for manual coding.
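As a rough illustration of how a studio automation system might drive the open control protocol described above: the sketch below frames load-template and play commands as network messages. The wire format is vendor-defined and not documented here, so the newline-delimited JSON framing, the `op` names, the template path, and the port are all assumptions for illustration only.

```python
import json

ENGINE_PORT = 7000  # assumed control port; the real port is vendor-defined

def encode_command(op: str, **params) -> bytes:
    """Frame one control message as newline-delimited JSON (assumed format)."""
    return (json.dumps({"op": op, **params}) + "\n").encode("utf-8")

def send_command(host: str, payload: bytes) -> None:
    """Deliver a framed command to the engine over TCP (not called here)."""
    import socket
    with socket.create_connection((host, ENGINE_PORT), timeout=5) as sock:
        sock.sendall(payload)

# Automated workflow: load a template over the network, then play on demand.
load = encode_command("load_template", path="templates/news_opener.utpl")
play = encode_command("play", channel=1)
```

In an automated rundown, a newsroom system would emit such commands on schedule rather than an operator clicking play.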

The studio’s VideoWall graphics system runs on high-performance real-time 2D/3D rendering hardware, delivering high-quality rendering of hybrid HD image/video templates, with DP signal output to the video wall processor.


Running on the Windows 11 24H2 64-bit operating system, the software offers an intuitive interface and a modular design for flexible expansion. It supports diverse, visually engaging graphic formats that convey substantial information, real-time pop-up effects, and HD video clip integration. It also ingests external data sources (weather, stock, SMS, and traffic updates) through ODBC, XML, and RSS.
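To make the RSS path concrete, here is a minimal sketch of turning a feed into ticker items for an on-air template, using only the Python standard library. The inline feed is sample data; in production the system would poll a live URL, and the function name is illustrative, not part of the product's API.

```python
import xml.etree.ElementTree as ET

# Sample RSS payload standing in for a live weather/stock/traffic feed.
RSS = """<rss version="2.0"><channel>
  <item><title>Storm warning issued</title></item>
  <item><title>Index up 1.2%</title></item>
</channel></rss>"""

def rss_titles(xml_text: str) -> list:
    """Extract item titles from an RSS 2.0 document for a ticker layer."""
    root = ET.fromstring(xml_text)
    return [item.findtext("title", "") for item in root.iter("item")]

headlines = rss_titles(RSS)  # two strings ready for the ticker template
```

The same parse-and-map pattern applies to XML data sources; ODBC sources would instead go through a database driver.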


Supporting unlimited layers with independent playout, the system allows any number of CG graphics elements (images, text, videos, 3D objects) within a scene, constrained only by rendering capacity. Each element can live on its own layer, enabling independent, non-interfering control. Switching relationships defined between layer elements prevent overlaps and playback conflicts, ensuring safe broadcasts. There is no restriction on the number or type of elements that can be assigned switching relationships; all switching configurations are managed at design time, simplifying safe operation.
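The layer/switching model above can be sketched as a small data structure: elements that share a switch group may not be on air simultaneously, so cueing one automatically takes the others out. Class and method names here are illustrative assumptions, not the engine's actual API.

```python
class LayerController:
    """Toy model of independent layers with exclusive switch groups."""

    def __init__(self):
        self.on_air = set()        # elements currently playing out
        self.switch_group = {}     # element name -> switch group name

    def define_switch(self, group: str, *elements: str) -> None:
        """Declare that these elements may never be on air together."""
        for e in elements:
            self.switch_group[e] = group

    def play(self, element: str) -> None:
        group = self.switch_group.get(element)
        if group is not None:
            # Take out any on-air element sharing the same group first.
            self.on_air -= {e for e in self.on_air
                            if self.switch_group.get(e) == group}
        self.on_air.add(element)

ctrl = LayerController()
ctrl.define_switch("lower_third", "name_strap", "score_strap")
ctrl.play("name_strap")
ctrl.play("score_strap")   # replaces name_strap: the two never overlap
```

Because the relationships are declared up front (at design time, per the description above), the operator cannot accidentally put conflicting graphics on air.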


The engine renders 3D objects, 3D text, 3D images, and 3D curves in real time, allowing adjustments during broadcasts from the studio control terminal. Rendering remains smooth, stable, and anti-aliased even during camera motion: when the camera focuses on virtual 3D objects or video windows, AR/VR graphics stay clear without digital artifacts. The system also supports common video formats, including AVI, MPEG-4, MPEG-2, MOV, H.264, Flash, and WMV, with full-scene anti-aliasing, 32-bit true color, and hardware-accelerated per-pixel lighting.


The system separates playback control from rendering: the rendering workstation and the control workstation are distinct machines connected over the network. This design ensures that a fault in the control workstation does not affect the rendering workstation's output, and it supports a main/backup broadcast safety mechanism. Both the main and backup units contain complete rendering and control systems, with real-time cross hot-standby for both rendering and control functions.
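One common way such a main/backup arrangement is realized is heartbeat-based failover, sketched below. The product's actual mechanism is internal and undocumented here; the timeout value, unit names, and monitor API are all assumptions for illustration.

```python
import time

HEARTBEAT_TIMEOUT = 0.5  # assumed: seconds of silence before failover

class StandbyMonitor:
    """Toy hot-standby selector: cut to backup if the main unit stalls."""

    def __init__(self):
        now = time.monotonic()
        self.last_seen = {"main": now, "backup": now}
        self.active = "main"

    def heartbeat(self, unit: str) -> None:
        """Record a liveness message received from a render/control unit."""
        self.last_seen[unit] = time.monotonic()

    def select_active(self) -> str:
        """Return which unit should feed the output right now."""
        if time.monotonic() - self.last_seen["main"] > HEARTBEAT_TIMEOUT:
            self.active = "backup"  # main missed its heartbeat: cut over
        return self.active
```

Because both units run complete rendering and control stacks (per the description above), the cutover can happen without reloading content on the backup.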

 

Feature Parameters:

1. Support for various tracking system integrations, including OptiTrack, Vicon, TrackMen, Ncam, RedSpy, Mosys, HTC Vive, SEEDER robotic crane, etc., with unified spatial coordinates.

2. Support for distortion correction and offset calibration of cameras and lenses of any model to ensure spatial perspective alignment in the final virtual and real composite images.

3. Support for controlling SEEDER robotic cranes and other devices to automatically calibrate internal and external camera parameters.

4. Position calculation accuracy up to millimeter-level precision, with a calculation time delay of less than 30ms.

5. Supports rapid generation of key video wall parameters, making it convenient to quickly generate video wall projection models in UE that are spatially aligned with the positioning system.

6. Data frame rate adaptation, supporting commonly used positioning data frame rates such as 24fps, 25fps, 30fps, 50fps, 60fps.

7. Support for real-time network data stream output for use with UE plugins or third-party software.

8. Supports rapid calibration and alignment based on the physical display screen, obtaining the accurate position of the video wall in space.

9. Support for recording the final trajectory data of virtual cameras in UE for direct use in post-processing workflows; precise frame interval clipping of trajectories is supported.

10. Supports one-click generation of video wall models, ensuring the video wall's coordinate system in UE is aligned with the calibrated coordinate system of the positioning system, eliminating the need for additional alignment adjustments.

11. Alignment of virtual rendering frames with real-shot frames, with support for manual adjustment of frame offsets to adapt to various usage scenarios, ensuring strict frame alignment of background layers, real-shot video layers, and AR layer content.

12. Support for nDisplay-based inner-frustum VFX shooting, with synchronized rendering of unlimited camera positions.

13. Support for XR screen expansion shooting, ensuring spatial alignment of content inside and outside the screen without any displacement.

14. Support for in-screen and out-of-screen color calibration, automatically correcting the color shifts introduced when content is displayed on the wall and re-captured by the camera, so that in-screen and out-of-screen imagery in the final output blends smoothly without color deviation.

15. Support for timecode-based data output, ensuring that the output image results are in the world coordinate system of the UE scene for direct use in post-rendering.

16. Supports quick navigation with one-click access to settings items, reducing time spent searching for settings among UE's many complex functions. User-definable quick-navigation entries can also be added to improve work efficiency.

17. Support for simultaneous shooting with multiple mobile positions and real-time signal switching.

18. Includes three sets of basic broadcast studio scene assets for demonstration and testing purposes.

19. Trigger control of VR/AR system devices through network protocols, including automatic loading of content and templates, instance generation, and scheduled broadcast triggering, to achieve automated production in the broadcast studio.
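Items 6 and 15 above concern frame rates and timecode-referenced data. As a worked example of how timecode maps onto the supported positioning rates, the sketch below converts an SMPTE-style timecode to an absolute frame index; drop-frame timecode is deliberately omitted to keep the arithmetic simple, and the function name is illustrative.

```python
def timecode_to_frames(tc: str, fps: int) -> int:
    """Convert HH:MM:SS:FF (non-drop-frame) to an absolute frame index
    at one of the supported data rates (24/25/30/50/60 fps)."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

# One hour of program at 25 fps is exactly 90,000 frames.
hour_at_25 = timecode_to_frames("01:00:00:00", 25)  # 90000
```

A post-production tool consuming the engine's timecode-stamped camera trajectory (item 9) would use exactly this kind of mapping to clip trajectories at precise frame intervals.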

 

System includes:

Windows 11 24H2 64-bit Graphics Operating System

Licensing Management System

UE5 Rendering Engine Management System

 

Workstation configuration:

1x 64-bit Processor, 12 Cores, 24 Threads, 4.7-5.6GHz

64GB DDR5 (2x32GB) Memory

512GB + 1TB M.2 SSD

NVIDIA A6000 Graphics Card

NVIDIA Sync Card

Dual Network Cards: 2.5Gb + WiFi6

1200W Hot-swappable Dual Power Supply

Includes a one-year hardware warranty, one on-site installation, and three years of technical support.