XR VideoWall Rendering Engine (XR VW 2110)


Built on Unreal Engine 5.4.1, the XR VW 2110 is a real-time 3D UHD XR video wall graphics rendering engine featuring two SMPTE ST 2110 4K IP stream inputs and four DisplayPort (DP) outputs. It offers an open control protocol that enables automatic loading of content and templates over the network, along with on-demand triggering of playback control functions for automated studio production. The output video achieves industry-leading AAA-quality visuals, notably through the Lumen dynamic global illumination and reflection system, the Nanite virtualized micropolygon geometry system, and virtual shadow maps. The Blueprint visual scripting system eliminates the need for designers to write code.
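As a rough illustration of how the open control protocol could be driven by automation software, the sketch below sends a "load template" and a "play" command to the engine. The wire format, port number, and command names here are assumptions for illustration only; the product's actual protocol is not documented in this section.

```python
import json
import socket

# Hypothetical client for the engine's open control protocol.
# Assumption: newline-delimited JSON commands over TCP on port 7000.
# The real wire format and port are not specified in this document.

def send_command(host: str, command: dict, port: int = 7000) -> dict:
    """Send one control command and return the engine's JSON reply."""
    with socket.create_connection((host, port), timeout=5) as sock:
        sock.sendall(json.dumps(command).encode("utf-8") + b"\n")
        reply = sock.makefile("r", encoding="utf-8").readline()
    return json.loads(reply)

# Example commands: load a template over the network, then trigger
# playback on demand (field names are illustrative).
load_cmd = {"action": "load_template", "template": "news_opening", "layer": 1}
play_cmd = {"action": "play", "layer": 1}
```

An automation controller would call `send_command(engine_host, load_cmd)` ahead of air time and `send_command(engine_host, play_cmd)` at the scheduled trigger point.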


The studio's video wall graphics system should use advanced HD real-time 2D/3D rendering hardware to render hybrid HD image/video templates in real time at high quality, and provide DP signals to the video wall processor.


Running the Windows 11 23H2 64-bit graphical operating system, the software offers a user-friendly, convenient interface. Its modular design allows flexible expansion as requirements grow. The system supports various forms of graphics, delivering novel visual effects that convey more information. It supports special effects such as real-time pop-up windows, allows the insertion of high-definition video clips, and supports external data connections such as weather, stock market, SMS, and road conditions (via ODBC, XML, RSS, etc.).
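To make the external-data capability concrete, here is a minimal sketch of pulling headlines from an RSS feed (one of the supported source types) into plain strings that a CG template could bind to a ticker. The feed XML is inlined for illustration; the engine's actual data-binding mechanism is not described here.

```python
import xml.etree.ElementTree as ET

# Minimal illustration: extract item titles from an RSS 2.0 feed,
# the kind of external data source (weather, stocks, traffic) the
# system can feed into a graphics ticker. Sample feed is inline.

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Traffic</title>
  <item><title>Ring road: heavy congestion</title></item>
  <item><title>Airport expressway: clear</title></item>
</channel></rss>"""

def rss_titles(xml_text: str) -> list[str]:
    """Return the title of every <item> in the feed, in order."""
    root = ET.fromstring(xml_text)
    return [item.findtext("title", "") for item in root.iter("item")]

print(rss_titles(SAMPLE_RSS))
# → ['Ring road: heavy congestion', 'Airport expressway: clear']
```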


The system supports switching relationships between an unlimited number of playback control layers and graphics elements. Playback control is independent across layers: the number of elements (pictures, text, videos, 3D objects) in a CG graphics scene is not limited by the system, only by rendering capacity. Elements can reside in independent layers and be controlled independently without interfering with one another. The system also supports defining switching relationships between elements across layers. Operators need only focus on the elements they want to play; predefined switching relationships automatically move aside elements in other layers that would otherwise overlap or cause on-air accidents, maximizing playback safety. There is no restriction on the number or type of elements that can participate in switching relationships. Because independent layer control and switching relationships are established at scene-design time, operators need not concern themselves with the implementation, yielding the simplest and safest playback.
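The switching-relationship idea above can be sketched as a small data structure: when an element is taken to air, any element it was predeclared to conflict with is automatically taken off air, so the operator only ever triggers what they need. The class and method names below are illustrative, not the product's API.

```python
# Illustrative sketch of predeclared switching relationships between
# layered elements. Not the product's API; names are hypothetical.

class LayerController:
    def __init__(self) -> None:
        self.on_air: set[str] = set()
        self.conflicts: dict[str, set[str]] = {}

    def declare_switch(self, a: str, b: str) -> None:
        """Predefine a switching relationship: a and b never coexist on air."""
        self.conflicts.setdefault(a, set()).add(b)
        self.conflicts.setdefault(b, set()).add(a)

    def play(self, element: str) -> None:
        # Automatically clear conflicting elements in other layers first,
        # then bring the requested element to air.
        self.on_air -= self.conflicts.get(element, set())
        self.on_air.add(element)

ctl = LayerController()
ctl.declare_switch("lower_third", "full_screen_graphic")
ctl.play("lower_third")
ctl.play("full_screen_graphic")  # lower_third is removed automatically
print(ctl.on_air)                # → {'full_screen_graphic'}
```

The operator triggers only `play(...)`; the conflict resolution defined at scene-design time handles the rest, which is the safety property the paragraph describes.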


Real-time rendering of 3D objects, 3D text, 3D images, and 3D curves is supported, and changes can be made live during broadcast from the studio control terminal. Scene rendering remains smooth, stable, and free of aliasing even during camera movement; when the camera zooms in on virtual 3D objects or virtual video windows, their graphical clarity is preserved without electronic-magnification artifacts. A wide range of video file formats is supported for playback, including AVI, MPEG-4, MPEG-2, MOV, H.264, Flash Video, and WMV. Full-scene real-time anti-aliasing is supported with hardware acceleration, using 32-bit true color and supporting pixel-level light sources.


The system employs a separated broadcast-and-control architecture: the rendering workstation and the playback control workstation are separate machines connected over the network, so a fault in the control workstation does not affect the rendering workstation's output. A main/backup broadcast safety mechanism is supported: both the main and backup hosts include complete rendering and control units and support real-time cross hot standby.
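The main/backup hot-standby principle can be sketched as a heartbeat monitor: the backup host promotes itself to active when the main host's heartbeat goes stale. The timeout value and transport are assumptions for illustration; the product's actual failover mechanism is not documented in this section.

```python
import time

# Illustrative sketch of hot-standby failover via heartbeat timeout.
# The 0.5 s threshold and the monitoring approach are assumptions,
# not the product's documented mechanism.

HEARTBEAT_TIMEOUT_S = 0.5

class StandbyMonitor:
    def __init__(self) -> None:
        self.last_heartbeat = time.monotonic()
        self.active = False  # backup host starts passive

    def on_heartbeat(self) -> None:
        """Called whenever a heartbeat arrives from the main host."""
        self.last_heartbeat = time.monotonic()

    def tick(self) -> bool:
        """Return True once the backup should take over output."""
        if time.monotonic() - self.last_heartbeat > HEARTBEAT_TIMEOUT_S:
            self.active = True
        return self.active
```

Because both hosts carry complete rendering and control units, takeover here would amount to the backup enabling its already-rendered output, which is what makes the switch effectively seamless.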


Feature Parameters:

1. Support for integration with various tracking systems, including OptiTrack, Vicon, TrackMen, Ncam, RedSpy, Mo-Sys, HTC Vive, SEEDER robotic cranes, etc., with unified spatial coordinates.

2. Support for distortion correction and offset calibration of cameras and lenses of any model to ensure spatial perspective alignment in the final virtual and real composite images.

3. Support for controlling SEEDER robotic cranes and other devices to automatically calibrate internal and external camera parameters.

4. Position calculation accuracy up to millimeter-level precision, with a calculation time delay of less than 30ms.

5. Supports rapid generation of key video wall parameters, making it convenient to quickly generate video wall projection models in UE that are spatially aligned with the positioning system.

6. Data frame rate adaptation, supporting commonly used positioning data frame rates such as 24fps, 25fps, 30fps, 50fps, 60fps.

7. Support for real-time network data stream output for use with UE plugins or third-party software.

8. Supports rapid calibration and alignment based on the physical display screen, obtaining the accurate position of the video wall in space.

9. Support for recording the final virtual-camera trajectory data in UE for direct use in post-production workflows; precise frame-interval trimming of trajectories is supported.

10. Supports one-click rapid generation of video wall models, ensuring alignment of the video wall's coordinate system in UE with the calibrated coordinate system of the positioning system; eliminating the need for additional alignment adjustments.

11. Alignment of virtual rendering frames with real-shot frames, with support for manual adjustment of frame offsets to adapt to various usage scenarios, ensuring strict frame alignment of background layers, real-shot video layers, and AR layer content.

12. Support for nDisplay-based in-camera VFX (ICVFX) inner-frustum shooting, with synchronized rendering of an unlimited number of camera positions.

13. Support for XR screen expansion shooting, ensuring spatial alignment of content inside and outside the screen without any displacement.

14. Support for in-screen and out-of-screen color calibration, correcting the color shifts that occur when content is displayed on the wall and then captured by the camera; automatic correction ensures smooth color transitions between in-screen and out-of-screen imagery in the final output, without color deviation.

15. Support for timecode-based data output, ensuring that the output image results are in the world coordinate system of the UE scene for direct use in post-rendering.

16. Support for quick navigation: one-click access to settings items reduces the time spent hunting for settings among UE's many complex functions; custom quick-navigation entries can also be added to improve work efficiency.

17. Support for simultaneous shooting from multiple camera positions with real-time signal switching.

18. Includes three sets of basic broadcast studio scene assets for demonstration and testing purposes.

19. Trigger control of VR/AR system devices via network protocols, with automatic loading of content and templates, instance generation, scheduled playback triggering, and other functions to achieve automated studio production.
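Items 6, 7, and 15 above describe a network data stream of positioning data at various frame rates with timecode. As a rough illustration of the frame-rate adaptation involved, the sketch below parses a hypothetical binary packet (frame counter, frame rate, position, rotation) and converts the frame counter to a common time base so 24/25/30/50/60 fps sources can be aligned. The packet layout is an assumption; the engine's real wire format is not documented here.

```python
import struct

# Hypothetical positioning packet: frame counter, fps, position (x,y,z),
# rotation (pan,tilt,roll). Layout is illustrative only; the engine's
# actual data-stream format is not specified in this document.
PACKET_FMT = "<IIffffff"

def parse_packet(data: bytes) -> dict:
    frame, fps, x, y, z, pan, tilt, roll = struct.unpack(PACKET_FMT, data)
    return {
        # Convert the frame counter to seconds so sources at 24, 25,
        # 30, 50, or 60 fps all map onto one common time base.
        "time_s": frame / fps,
        "position": (x, y, z),
        "rotation": (pan, tilt, roll),
    }

# Example: frame 50 at 25 fps is 2.0 seconds into the take.
pkt = struct.pack(PACKET_FMT, 50, 25, 1.0, 2.0, 0.5, 90.0, -5.0, 0.0)
print(parse_packet(pkt)["time_s"])  # → 2.0
```

A UE plugin or third-party tool consuming the stream would apply the same rate conversion before matching positioning samples to rendered frames.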


System includes:

Windows 11 23H2 64-bit Graphical Operating System

Licensing Management System 

UE5 Rendering Engine Management System

ST2110 IP Data Stream Engine


Workstation configuration:

1x 64-bit Processor, 12 Cores, 24 Threads, 2.8-3.9GHz

64GB DDR5 Memory

NVIDIA RTX 4080 Graphics Card

512GB + 1TB M.2 SSD

1200W Hot-swappable Dual Power Supply

Dual 1Gb Management Network Interface Card

Dual 25Gb Data Network Interface Card