Unleashing the Power of 5G and 6G for the XR Revolution with NVIDIA Omniverse and NVIDIA GPUs

 


Extended reality (XR) technologies, including virtual reality (VR), augmented reality (AR), and mixed reality (MR), will radically change how we interact with digital media content. The success of an immersive XR revolution, however, depends crucially on advanced wireless networks with the performance, reliability, and scalability required to support such applications. This is why fifth-generation (5G) and the forthcoming sixth-generation (6G) cellular networks bring a wide range of capabilities that are tailor-made for XR applications.

5G Networks: The Foundation for XR Growth

5G networks, now rolling out globally, offer peak data rates as high as 20 Gbps, latencies as low as 1 millisecond, and support for up to 1 million devices per square kilometer. Together, these capabilities provide the basic performance and scale that XR applications need.
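As a quick sanity check on those numbers, here is a minimal back-of-envelope sketch. The per-stream bitrate and usable-capacity fraction are assumptions chosen for illustration (compressed 8K 360-degree VR is often quoted in the 50-200 Mbps range), not measured figures:

```python
# Rough capacity check: how many compressed 8K 360-degree VR streams
# could one 5G cell carry?  All figures below are assumptions.

PEAK_CELL_RATE_GBPS = 20.0      # 5G theoretical peak (shared by all users)
USABLE_FRACTION = 0.5           # assume half the peak is usable in practice
STREAM_BITRATE_MBPS = 100.0     # assumed bitrate of one compressed VR stream

def max_concurrent_streams(cell_gbps=PEAK_CELL_RATE_GBPS,
                           usable=USABLE_FRACTION,
                           stream_mbps=STREAM_BITRATE_MBPS):
    """Number of VR streams that fit in the usable cell capacity."""
    usable_mbps = cell_gbps * 1000 * usable
    return int(usable_mbps // stream_mbps)

print(max_concurrent_streams())  # 100 streams under these assumptions
```

Even under these optimistic assumptions, a single cell serves on the order of a hundred VR users, which is why per-cell capacity and device density both matter for shared XR.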

5G's huge bandwidth promises high-resolution, 360-degree streaming for VR, so users can experience richly detailed virtual worlds without being tethered to thick, data-hungry cables at home. And because 5G is also low-latency, users can interact with one another in AR in real time, whether as players in shared gaming spaces, collaborators in an AR workspace, or shoppers and vendors in an animated round of purchasing.

6G Networks: Taking XR to New Heights

While 5G is establishing the basic architecture for the XR revolution, the development of 6G networks, expected around 2030, will push wireless capabilities to new limits, providing peak data rates of 1 Tbps, sub-millisecond latencies, and even higher device densities. These developments will unlock the power of XR as never before, blurring the distinction between physical and digital worlds.

6G's extremely high throughput will make it possible to stream human-eye-resolution VR, pushing immersion to levels that can be indistinguishable from real life. Its ultra-low latency will make it possible to track subtle motions with millimeter precision and feed that data to haptic systems, producing tactile sensations convincing enough to pass for the real thing.
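To see why human-eye-resolution VR points toward terabit-class links, here is a rough, illustrative calculation. The acuity, field-of-view, and refresh-rate figures are assumptions for the sketch, not specifications:

```python
# Back-of-envelope: uncompressed bitrate of "human-eye-resolution" VR.
# All numbers below are assumptions chosen for illustration.

PIXELS_PER_DEGREE = 60            # approximate limit of human visual acuity
FOV_H_DEG, FOV_V_DEG = 150, 135   # assumed per-eye field of view
BITS_PER_PIXEL = 30               # 10-bit colour, 3 channels
FPS = 120                         # assumed refresh rate
EYES = 2

def uncompressed_gbps():
    pixels = (FOV_H_DEG * PIXELS_PER_DEGREE) * (FOV_V_DEG * PIXELS_PER_DEGREE)
    bits_per_second = pixels * EYES * BITS_PER_PIXEL * FPS
    return bits_per_second / 1e9

print(f"{uncompressed_gbps():.0f} Gbps uncompressed")
```

That works out to roughly 525 Gbps uncompressed per user; even an aggressive 100:1 compression ratio leaves around 5 Gbps, well beyond typical per-user 5G rates, which is exactly the gap 6G is meant to close.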

Added to this, 6G's connection density will support extremely high concentrations of XR devices, enabling large-scale shared XR experiences. Imagine attending a concert in XR with thousands of other users, each with their own photorealistic avatar, or playing an AR game across a whole city with each player's actions integrated instantly. Social and collaborative XR applications of this scale will be enabled by 6G networks.

Technical Challenges and Enablers

Delivering on the promise of 5G and 6G for XR depends on overcoming a number of technical challenges. In particular, the network resources carrying XR traffic must be prioritized and handled correctly to optimize the latency, jitter, and bandwidth of the XR content being delivered. New dynamic spectrum sharing strategies, in which multiple services such as high-definition video, gaming, and machine-to-machine (M2M) applications flexibly and dynamically coexist in the same frequency bands, will form an integral part of optimizing spectrum use for XR applications.
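The prioritization idea can be illustrated with a toy strict-priority scheduler. Real base-station schedulers are far more sophisticated; the traffic classes and packet names here are invented purely for illustration:

```python
import heapq

# Toy sketch of priority-aware scheduling for XR traffic: packets
# tagged as XR are served before best-effort packets, and FIFO order
# is preserved within each class via a sequence counter.

XR, BEST_EFFORT = 0, 1   # lower value = higher priority

class Scheduler:
    def __init__(self):
        self._q = []
        self._seq = 0    # tie-breaker preserves FIFO within a class

    def enqueue(self, priority, packet):
        heapq.heappush(self._q, (priority, self._seq, packet))
        self._seq += 1

    def dequeue(self):
        return heapq.heappop(self._q)[2]

s = Scheduler()
s.enqueue(BEST_EFFORT, "email")
s.enqueue(XR, "vr-frame-1")
s.enqueue(XR, "vr-frame-2")
print(s.dequeue(), s.dequeue(), s.dequeue())  # XR frames drain first
```

The XR frames drain before the best-effort packet even though the email arrived first, which is the essence of latency-sensitive traffic prioritization.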

This is why network slicing matters: 5G already supports it, and 6G is expected to take slicing and spectrum partitioning further, enabling operators to create virtual sub-networks dedicated to XR requirements. In this way, XR traffic is segregated from other types of data flows, giving XR applications a guaranteed quality of service without interruptions or failures.
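One way to picture a slice is as a small record of guaranteed resources plus an admission rule. The field names and values below are illustrative, not actual 3GPP slice parameters:

```python
from dataclasses import dataclass

# Sketch of how an operator might describe per-slice QoS guarantees.
# Field names and values are invented for illustration.

@dataclass(frozen=True)
class Slice:
    name: str
    min_bandwidth_mbps: float   # guaranteed bandwidth floor
    max_latency_ms: float       # latency bound the slice must honour

SLICES = [
    Slice("xr",          min_bandwidth_mbps=500, max_latency_ms=5),
    Slice("video",       min_bandwidth_mbps=100, max_latency_ms=50),
    Slice("best_effort", min_bandwidth_mbps=0,   max_latency_ms=500),
]

def admit(slice_: Slice, app_latency_ms: float) -> bool:
    """Admit an application onto a slice only if the slice's latency
    guarantee is at least as tight as the application's requirement."""
    return slice_.max_latency_ms <= app_latency_ms

# An XR app needing <=10 ms fits the XR slice but not the video slice.
print(admit(SLICES[0], 10), admit(SLICES[1], 10))
```

The point of the sketch is the isolation: an XR application is only ever admitted to a slice whose guarantees cover its needs, so bursts on other slices cannot degrade it.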

Edge computing will also be critical for responsive, immersive XR. Locating much of the computation and storage that XR needs nearby, rather than in a distant cloud, reduces data-processing latency, which is essential for the real-time rendering, interaction, and synchronization that sustain the virtual-reality effect.
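A simple latency-budget sketch shows why edge placement matters. The round-trip, render, and display times below are assumed figures for illustration, not measurements:

```python
# Rough motion-to-photon budget: why edge placement matters.
# All numbers are illustrative assumptions.

BUDGET_MS = 20.0     # a commonly cited comfort bound for VR

def motion_to_photon(network_rtt_ms, render_ms=8.0, display_ms=4.0):
    """Total delay from head motion to updated pixels on the display."""
    return network_rtt_ms + render_ms + display_ms

cloud = motion_to_photon(network_rtt_ms=40.0)   # distant cloud region
edge = motion_to_photon(network_rtt_ms=4.0)     # nearby edge site
print(cloud <= BUDGET_MS, edge <= BUDGET_MS)    # False True
```

With a distant cloud the network alone blows the comfort budget before rendering even starts; moving the compute to an edge site brings the whole pipeline back under it.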

5G and 6G networks could incorporate AI/ML algorithms that enable dynamic, adaptive end-to-end optimization tailored to the requirements of XR applications and the users' experiences. This could include predictive modeling of network usage to dynamically reallocate resources at the base-station and radio-network levels so that users' XR sessions run uninterrupted.
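The predict-and-reallocate loop can be sketched with a toy moving-average load predictor. Production networks would use far richer ML models; the window, threshold, and load samples here are invented for illustration:

```python
from collections import deque

# Toy predictor: a moving average of recent cell load, used to decide
# when to pre-provision extra capacity for XR traffic.

class LoadPredictor:
    def __init__(self, window=4, threshold=0.8):
        self.samples = deque(maxlen=window)   # recent load samples
        self.threshold = threshold            # fraction of capacity

    def observe(self, load):
        self.samples.append(load)

    def predicted(self):
        return sum(self.samples) / len(self.samples)

    def should_scale_up(self):
        return self.predicted() > self.threshold

p = LoadPredictor()
for load in (0.7, 0.8, 0.9, 0.95):   # load trending upward
    p.observe(load)
print(p.should_scale_up())  # True: predicted load 0.8375 exceeds 0.8
```

Acting on the prediction, rather than waiting for congestion, is what lets the network reallocate resources before XR users notice anything.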

Some technical issues remain to be ironed out, both in the development of 5G and 6G networks themselves and in the definition of standards. But the massive investment appears to be bearing fruit: research and standardization across the telecommunications industry is addressing these issues and laying the groundwork for deploying 5G and 6G networks capable of supporting XR. This work is driven by technical committees and standards-developing organizations such as the 3rd Generation Partnership Project (3GPP), the International Telecommunication Union (ITU), and the Institute of Electrical and Electronics Engineers (IEEE).

The Future of XR: A Transformative Impact

These capabilities are just the beginning. As 5G and 6G networks mature, and as XR devices grow more powerful and affordable, XR will move to the heart of our daily lives. In entertainment and leisure, activities that once made us passive spectators will give way to immersive media experiences and explorable worlds, whether that means stepping into our favourite movie, visiting a theme park, or inhabiting a totally realistic game.

XR will transform education and training by giving students and workers practical, hands-on learning opportunities. It will let people walk among and explore historical, archaeological, and cultural sites that would otherwise remain inaccessible, whether for geographic, ecological, logistical, or financial reasons. The same goes for scientific sites, weather events, and the climate impacts of tomorrow: even visualizing disease vectors, or what the world will look like as a result of climate change, could be accomplished with XR. On a more mundane level, it will let apprentices learn a new skill or follow a recipe step by step. Beyond self-guided learning in classrooms, at work, or at leisure, XR will also dramatically transform how we train workers, with mentors and sponsors guiding new hires through the learning process for existing jobs and entirely new roles.

The healthcare sector stands to make significant use of advanced XR and wireless networking capabilities. Telemedicine applications will exploit XR to enable remote consultation and treatment: doctors will examine patients' conditions virtually and direct care remotely, wherever the patient might be. XR will also let future medical professionals practice surgical procedures and diagnose conditions in virtual settings.

In business, XR will transform collaboration and communication. Remote workers will collaborate in shared virtual spaces that provide a feeling of 'presence' and support natural workflows. Virtual conferences and trade shows will become more immersive and effective, letting participants interact with products and services realistically even when the product physically exists elsewhere.

Coupled with 5G and 6G wireless networks, XR technologies will usher in a new age of immersive experiences that transform the way we live, work, and play. The unprecedented bandwidth, low latency, and dense connectivity of next-generation wireless networks will power rich, interactive, and responsive XR applications.

We have an innate human instinct to communicate and connect, and I believe that in the coming years — as we tackle the difficulties and opportunities of 5G and 6G — XR will become more natural, baked-in and seamless, and the real and virtual will become increasingly indistinguishable from each other. There will be infinite opportunities for innovation, creativity, empathy and humanity.

What the XR revolution, facilitated by 5G and 6G networks, might precipitate is a change in our perceptual habits: a cognitive shift that will transform and augment the world we perceive. As this new future emerges, it is important to remember that there are ethical, social, and regulatory questions we ought to grapple with. Any technology that departs materially from the old can bring about enormous new benefits, and we must work to ensure that everyone is included in those benefits.

The path for XR to permeate society, and for 5G and 6G networks to be widely deployed, is already firmly in motion; the possibilities are vast and still being imagined. The future beckons those with the foresight and tenacity to push the boundaries of the possible and help human potential reach its fullest expression.

Enabling XR with NVIDIA Omniverse and Advanced GPUs

Achieving this vision for XR over 5G and 6G networks requires world-class computing platforms and tools. NVIDIA, a leader in GPUs and AI computing, is well placed to accelerate the development and deployment of XR applications.

NVIDIA's Omniverse, a real-time 'creation engine', allows vast virtual worlds and environments to be created and experienced at high fidelity. It provides a platform on which designers, artists, and engineers can co-create in real time over the internet, in design and production pipelines spanning everything from games and films to reconstructed cities. Built on NVIDIA GPUs, Omniverse brings physics-based, lifelike fidelity to the things its users create, so that the objects in a scene look and behave like things in the real world.

In XR development, Omniverse can be employed for rapid prototyping and for building interactive virtual scenes in which XR experiences are designed and iterated. Its real-time feedback and streaming functionality shorten the iteration cycles of XR setups, while its support for popular off-the-shelf tools and formats lets it integrate with existing XR development pipelines.

The Omniverse XR environment also supports digital twins: virtual representations of real objects, systems, or environments that can be tested and optimized as prototypes before an XR application is deployed in the real world. 5G and 6G networks will let XR participants distributed across locations and time zones access Omniverse-based digital twins virtually, so that real-time design collaboration among XR application developers, artists, designers, workers, reviewers, producers, and end users can take place in virtual conferences and meetings.
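Keeping a digital twin consistent with its physical counterpart over a network largely comes down to applying timestamped state updates in order. A minimal sketch of that idea, with a made-up twin field and no real Omniverse APIs:

```python
# Minimal sketch of keeping a digital twin in sync with its physical
# counterpart: updates carry timestamps, and stale (out-of-order)
# updates are dropped.  The twin's fields are invented for illustration.

class DigitalTwin:
    def __init__(self):
        self.state = {}
        self._last_ts = {}

    def apply_update(self, key, value, ts):
        """Apply an update only if it is newer than what we hold."""
        if ts > self._last_ts.get(key, float("-inf")):
            self.state[key] = value
            self._last_ts[key] = ts
            return True
        return False   # stale packet: discard

twin = DigitalTwin()
twin.apply_update("arm_angle", 45.0, ts=2.0)
late = twin.apply_update("arm_angle", 30.0, ts=1.0)  # arrives late
print(twin.state["arm_angle"], late)   # 45.0 False
```

Dropping stale updates keeps every participant's view of the twin converging on the newest physical state, even when network delivery is out of order.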

NVIDIA also develops the hardware behind the high-performance computing that robust XR applications demand. Its GPUs, including the H100, the L40S, and the Grace Hopper Superchip, are designed for 'record-breaking performance and incredible efficiency'. These processors already power some of the world's largest data centers, at companies such as Meta and Tencent, which are home to much of the data processing and innovation shaping the future of the metaverse and XR.

The NVIDIA H100 GPU is built on the NVIDIA Hopper architecture. It packs roughly 80 billion transistors to bring general-purpose accelerated computing to a new level for AI, graphics, HPC, and XR workloads. The H100 delivers trillions of operations per second for the rendering and AI demands of XR, accelerating AI-driven graphics capabilities such as NVIDIA RTX ray tracing and AI image processing for real-time XR applications. It also supplies the high-bandwidth memory that XR applications need for real-time ray tracing, photorealistic graphics, and demanding simulations.
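The real-time constraint behind those requirements is easy to quantify: at a 90 Hz refresh rate, the renderer has about 11 ms per frame. A small illustrative calculation, in which the resolution and per-pixel shading cost are assumptions rather than figures for any particular GPU:

```python
# Per-frame compute budget at VR refresh rates.  Numbers are
# illustrative assumptions, not hardware specifications.

def frame_budget_ms(fps):
    """Wall-clock time available to render one frame."""
    return 1000.0 / fps

def shading_ops_per_frame(width, height, ops_per_pixel):
    """Very rough count of shading operations for one frame."""
    return width * height * ops_per_pixel

budget = frame_budget_ms(90)
ops = shading_ops_per_frame(3840, 2160, ops_per_pixel=500)  # assumed cost
print(f"{budget:.1f} ms per frame, {ops / 1e9:.1f} G ops")
```

Billions of shading operations squeezed into roughly 11 milliseconds, frame after frame, is the workload that teraflop-class GPUs exist to serve.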

The Grace Hopper Superchip (GH200), designed to lift XR performance for the most demanding workloads, is NVIDIA's first product to combine the Hopper GPU and the Grace CPU in a single package. Its energy efficiency and performance come from the coherent, high-bandwidth NVLink-C2C interconnect between the two, which lets the CPU and GPU draw directly on each other's memory.

In addition to the H100 and GH200, NVIDIA offers the L40S GPU, which is optimized for the evolving XR application ecosystem around Omniverse. The L40S is designed with high-performance graphics and AI workloads in mind.

The L40S GPU provides massively parallel computing performance through its CUDA cores and Tensor Cores. It also supports modern graphics APIs such as Vulkan and DirectX, letting developers build XR applications with advanced rendering techniques more easily. With NVIDIA's comprehensive line of developer tools, developers can focus on creating immersive, impactful, and inclusive experiences for their customers.

Integration with Omniverse is another benefit of the L40S GPU, which has been explicitly designed to work in harmony with the real-time collaboration and simulation capabilities of the NVIDIA Omniverse platform. This tight integration means developers spend less time working out how to render experiences and more time iterating on the XR experience itself through unified workflows. The L40S's compute capabilities can render the physics of complex, realistic environments within Omniverse at high resolution, along with AI-powered character animation.

It taps into NVIDIA's array of SDKs and tools, including NVIDIA RTX, NVIDIA PhysX, and NVIDIA AI, to create hyper-realistic imagery, realistic physics interactions, and intelligent behaviour for XR applications quickly and efficiently. Together with Omniverse, the L40S GPU creates a compelling ecosystem for XR development, helping developers push what is possible in realism, interactivity, and presence. And with the L40S being cloud-capable, Omniverse's collaborative features let teams of developers create and optimize XR workflows regardless of where they are working from.

As 5G matures and 6G networks arrive, the L40S GPU and Omniverse will underpin the new wave of XR applications used to develop and deliver richer, more immersive collaborative experiences. The bandwidth, latency, and connection density of these networks will allow XR experiences built with the L40S and Omniverse to be streamed and delivered remotely to virtually any device.

The L40S GPU and Omniverse also enable highly detailed digital twins and high-resolution 3D models of real-world objects, systems, and environments. Developers use digital twins to 'train' XR experiences for real-world deployments and then make live revisions to boost usability, effectiveness, and ROI, executing their algorithms against newly created digital twins that the L40S's onboard compute resources render in real time. The L40S-powered 3D model of the Chevrolet Camaro ZL1 shown above was simulated, refined, and rendered in real time using artificial intelligence (AI); the rendered image represents a car that does not physically exist.

As demand grows for immersive applications and interactive XR experiences, the L40S GPU and Omniverse will become essential XR infrastructure for large-scale rendering and analysis of such compute-intensive workloads, including AI systems for participatory simulation and rendering, and an invaluable asset for developers and content creators. This will let metaverse applications leverage the full performance potential of next-generation 5G and 6G networks.

But the true future of XR lies not in hardware and compute architectures alone, but in what we do with them: the applications we create, and the new and profound experiences that will inspire us and have a meaningful impact on our lives and the world. For developers, the L40S GPU and Omniverse will be crucial to this direction. Designing and building the next generation of XR applications on these NVIDIA technologies, backed by the NVIDIA developer program, gives you a huge head start on your competitors and a hand in defining the future of XR across the high-performance digital environments that 5G and 6G networks will enable. Omniverse's tools, combined with NVIDIA GPUs such as the H100, L40S, and GH200, already make it possible to create XR applications with photorealistic, fully interactive behavior: scenes with very high-resolution 3D visuals and dynamic, full-body characters, built on the latest hardware and software stack from NVIDIA.

Only time will tell how XR will evolve alongside these GPUs and as platforms like NVIDIA Omniverse (see the figure below) mature, but the possibilities for new creative and innovative forms are unprecedented. XR is now entering its adolescence, and soon enough we will begin to see radically new kinds of interactions with, and experiences of, both 'physical' and 'digital' realities. By substituting virtual experiences for physical travel, these new forms of experience could even help ease society's fossil-fuelled energy dilemma. Figure 40 below is a screen grab capturing a 3D avatar inside the new NVIDIA Omniverse simulation environment.

When XR combines with 5G and 6G networks and NVIDIA technologies, the result can be all of these things at once. If these foundational technologies continue to advance and expand, they will lead to new kinds of experiences, and from them a staggering number of new XR use cases in entertainment and education, telemedicine, human augmentation, digital-twin manufacturing, and beyond.
