Hello @DataJuggler! Yes, it is possible with Omniverse Cloud. We have an example of this here: the Rimac Nevera 3D configurator. Clicking "Launch the Experience" opens a web-based app running inside Omniverse Cloud.
Fine print: Rimac Nevera 3D Configurator can be launched from the latest Google Chrome or Microsoft Edge Browsers on Windows, macOS, or Chrome OS devices. Support for mobile phones and tablets is coming soon.
NVIDIA announced NVIDIA Omniverse Enterprise six months later, allowing global 3-D design teams working across multiple software suites to collaborate in real-time in a shared virtual space. Richard Kerris, vice president of the Omniverse development platform at NVIDIA, said in a 2021 press release that NVIDIA Omniverse "connects worlds by enabling the vision of the metaverse to become a reality."
Another NVIDIA platform, Omniverse Avatar, debuted in November of 2021. It helps developers generate, animate, simulate and render state-of-the-art interactive avatars for use in NVIDIA Omniverse. It also features AI-enabled toolsets, accelerated rendering and simulation technology, all designed to facilitate the creation of realistic, high-fidelity avatars. While the platform is still in development, developers can sign up for the waiting list and use NVIDIA Omniverse Audio2Face to generate expressive facial animations in the meantime.
More recently, at the NVIDIA GTC AI conference in March of 2022, CEO Jensen Huang discussed how major tech players such as Amazon Robotics, PepsiCo, Kroger and others are using NVIDIA Omniverse Enterprise to build digital twins.
Other announcements included the availability of nearly a million Omniverse-ready 3-D assets, live-sync connections with apps such as Adobe Substance 3-D Materials and Painter, Epic Games Unreal Engine and Maxon Cinema 4-D, and NVIDIA OVX, a computing system designed to run complex digital twin simulations for NVIDIA Omniverse Enterprise.
Mark Gruenwald, a writer and editor for Marvel, and one of the leading theoreticians on the concept of the omniverse, defined it as "the continuum of all universes, the space/time matrix that comprises all alternative realms of reality." Despite his background, Gruenwald claimed the idea was not limited to comics.
Another multiverse player, Microsoft Mesh, currently available as a preview for HoloLens 2, allows developers to build immersive, multi-user, cross-platform mixed reality (XR) apps. It enables presence and shared experiences from anywhere on any device. Mesh can be accessed via app using HoloLens 2, VR headsets, mobile phones, tablets or PCs.
Scott Clark is a seasoned journalist based in Columbus, Ohio, who has made a name for himself covering the ever-evolving landscape of customer experience, marketing and technology. He has over 20 years of experience covering information technology and 27 years as a web developer. His coverage ranges across customer experience, AI, social media marketing, voice of customer, diversity & inclusion and more. Scott is a strong advocate for customer experience and corporate responsibility, bringing together statistics, facts and insights from leading thought leaders to provide informative and thought-provoking articles.
A USD-based collaboration platform for VFX, game development, design and visualization
Officially launched this January after a year in beta, Omniverse enables artists and designers anywhere in the world to collaborate on projects in real time.
Data is exchanged between compatible CAD and DCC applications and Omniverse in USD format, with connector plugins available for tools including 3ds Max, Maya, Revit, Rhino, SketchUp and Unreal Engine.
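To give a sense of what these tools are exchanging, here is a minimal USD scene in its human-readable .usda form; the prim names are purely illustrative, not taken from any particular connector:

```usda
#usda 1.0
(
    defaultPrim = "World"
    upAxis = "Y"
)

def Xform "World"
{
    # A simple sphere prim; connector plugins round-trip
    # geometry and transforms through layers like this one
    def Sphere "Ball"
    {
        double radius = 0.5
        double3 xformOp:translate = (0, 0.5, 0)
        uniform token[] xformOpOrder = ["xformOp:translate"]
    }
}
```

Because USD layers are composable, each application can write its changes to its own layer while Omniverse merges them into one live scene.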
Opening up Omniverse to Macs, mobile devices, and workstations with AMD or Intel GPUs
So far, Omniverse has effectively been restricted to users of workstations with Nvidia hardware: its core components and the Omniverse Create and Omniverse View apps only run on Windows or Linux.
Streaming via GeForce Now should make it possible to use the Omniverse apps on pretty much any desktop or mobile device: as well as Windows, GeForce Now runs on macOS, ChromeOS, Android and iOS, and in the Chrome browser.
Machine learning developers get the synthetic data-generation system Omniverse Replicator, plus two industry-specific data generators: Isaac Sim for robotics and Drive Sim for autonomous vehicles.
Availability and system requirements
Omniverse Cloud is available free in early access for users with Nvidia Developer accounts. Instructions on how to apply for early access are available from Nvidia.
Nvidia Omniverse is a computing platform built to enhance digital design and development by integrating 3D design, spatial computing and physics-based workflows across Nvidia tools, third-party apps and artificial intelligence (AI) services. Created specifically for developing applications in the metaverse, the real-time platform is used for building digital twins of products, factories, warehouses and infrastructure. It can also streamline the creation of 3D-related media for entertainment and product demonstrations, as well as enterprise media content rendered on computers, phones and extended reality (XR) devices.
The platform, launched in 2022, is available as a cloud service or a private instance running on premises. Additionally, it supports plugins and integrations for deploying Omniverse content, applications and autonomous control systems across cars, robots, building controls, equipment and medical devices.
Nvidia Omniverse helps streamline workflows for designing, simulating and optimizing equipment, products and processes across different roles and expertise for virtual design. For example, Mercedes-Benz and BMW are using it to improve their product and factory designs. It is also helping companies optimize mobile network deployment, warehouse layouts, building construction and smart city deployments.
The platform can also serve as an integration tier for workflows that span tools from different vendors. This can reduce the integration challenges in crafting point-to-point integrations for specific workflows. For example, teams could use design tools from one vendor, simulation tools from another and rendering engines from a third to streamline virtual development efforts.
Nvidia Omniverse, through its Omniverse Replicator and Isaac Sim components, can also help generate synthetic data for testing various autonomous systems, AI algorithms and robot control systems. This function can streamline the development of more capable autonomous cars, warehouse materials handling equipment and robotic controls. The final control software can be sent to various target controllers, including Nvidia-specific embedded hardware or third-party controllers supporting standards such as Unified Robot Description Format or Robot Operating System.
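Replicator scripts run inside Omniverse against the omni.replicator.core API, so they cannot be shown standalone here. As a hedged illustration of the underlying idea only, the sketch below randomizes hypothetical scene parameters and emits automatically labeled samples; every name in it (the parameter lists, the functions) is invented for this example, not part of any Nvidia API:

```python
import json
import random

# Hypothetical scene parameters a synthetic-data pipeline might randomize.
# Real Replicator scripts drive an actual renderer; this sketch only
# illustrates the domain-randomization idea in plain Python.
LIGHTING = ["overcast", "noon", "warehouse_led"]
OBJECTS = ["pallet", "forklift", "worker"]

def generate_sample(rng: random.Random) -> dict:
    """Produce one randomized, automatically labeled training sample."""
    return {
        "object_class": rng.choice(OBJECTS),  # ground-truth label, free with synthesis
        "lighting": rng.choice(LIGHTING),
        "position": [round(rng.uniform(-5.0, 5.0), 2) for _ in range(3)],
        "rotation_deg": round(rng.uniform(0.0, 360.0), 1),
    }

def generate_dataset(n: int, seed: int = 0) -> list[dict]:
    """Seeded generation keeps datasets reproducible across runs."""
    rng = random.Random(seed)
    return [generate_sample(rng) for _ in range(n)]

if __name__ == "__main__":
    print(json.dumps(generate_dataset(3), indent=2))
```

The key property the sketch shares with real synthetic-data pipelines is that labels come for free: because the generator chooses the object class and pose, no manual annotation step is needed.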
In addition, Nvidia Omniverse also supports more consumer-facing development for generating avatars, asking questions about physical products and visualizing the furniture layout in a 3D representation of rooms.
Omniverse supports various specifications and standards that simplify the exchange of 3D-related data across multiple tools. Nvidia is working with the Universal Scene Description (USD) community to extend the specification to support Material Definition Language (MDL) and PhysX capabilities.
The essential value of the Nvidia Omniverse platform comes from its support of a rich collection of Nvidia, third-party and open source tools and formats, delivered as plugins, extensions and services.
Nvidia Omniverse helps streamline the development lifecycle of physical products and virtual experiences across various roles and expertise. The platform can help businesses manage the complexity of building new products, designing more efficient facilities and creating more engaging user experiences.
Nvidia Omniverse is currently the most comprehensive platform for integrating 3D and physics-based workflows across various cloud services, third-party applications and rendering engines. The platform's tools and supporting services ecosystem have been undergoing rapid innovation. In the short term, Nvidia said it will continue to improve the integration of 3D workflows with its AI hardware and tools.
Nvidia is also actively working with various industry groups and standards bodies to improve the capabilities of multiple standards, specifications and open source tools. For example, it is a member of the OpenXR community, developing standards to streamline XR and spatial computing experiences across different devices. It is also helping guide the Graphics Library Transmission Format (glTF) standard for exchanging 3D content for consumer-facing applications. Additionally, it is helping to extend the USD format beyond 3D scenes to support more complex engineering and simulation workflows. Nvidia will continue to help weave these capabilities into the Omniverse platform.
The platform also supports a rich marketplace to make it easier for vendors, domain experts and systems integrators to monetize their expertise and services. It will continue to enrich these offerings. These enhance the ability for enterprise users to mix and match design, development, test and monitoring capabilities across various tools. Nvidia currently has partnerships with leading product lifecycle management, geographic information system, CAD, computer-aided engineering, simulation and gaming engine vendors. Nvidia Omniverse will continue to streamline workflows across these tools.
Back in April, NVIDIA announced several new robotics-related technologies at the 2024 GPU Technology Conference (GTC). These new products included Project GR00T, Jetson Thor, Isaac Lab, OSMO, Isaac Manipulator, and Isaac Perceptor.