The NVIDIA Omniverse™ Blueprint for Autonomous Vehicle (AV) Simulation is a reference workflow to create rich 3D worlds for training, testing, and validation. The blueprint contains APIs and services to build and enhance digital twins from real-world sensor data, model physics and behavior of dynamic objects in a scene, and generate physically accurate and diverse sensor data.
With this API-based architecture, the blueprint can be seamlessly integrated into existing workflows, enabling developers to replay driving data, generate new ground-truth data, and perform closed-loop testing.
The blueprint is part of NVIDIA Halos, NVIDIA's guardian framework for AV safety, which comprises state-of-the-art hardware and software elements, tools, models, and design principles to safeguard end-to-end AV stacks, from the cloud to the car.
The Omniverse Blueprint for AV Simulation includes:
- Neural Reconstruction: Enhances real-world driving logs with novel sensor viewpoints and the addition or removal of assets.
- Sensor RTX: Renders high-fidelity, physically based data for sensors commonly used in autonomy, including camera, radar, and lidar.
- AV Sim Enhancement: Provides the physics, behavior, and animation necessary to simulate real-world scenarios.
- Cosmos Transfer WFM: A world foundation model (WFM) that generates new variations of a given scenario, including weather, lighting, and geography, grounded in the physics provided by Omniverse.
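The components above can be thought of as successive stages of one workflow: reconstruct a scene from a driving log, add physics and behavior, vary the scenario, then render sensor data. A minimal sketch of that ordering follows; every stage and parameter name here is hypothetical and illustrative only, not the blueprint's actual API.

```python
from dataclasses import dataclass, field

# Hypothetical request object; the real blueprint exposes these
# capabilities as APIs/services, but these names are illustrative.
@dataclass
class ScenarioRequest:
    drive_log: str                 # source driving log to reconstruct
    weather: str = "clear"         # variation applied by the WFM stage
    sensors: list = field(default_factory=lambda: ["camera", "radar", "lidar"])

def build_pipeline(request: ScenarioRequest) -> list:
    """Arrange the blueprint's four components as one conceptual pipeline."""
    return [
        ("neural_reconstruction", request.drive_log),
        ("av_sim_enhancement", "physics+behavior"),
        ("cosmos_transfer", request.weather),
        ("sensor_rtx", tuple(request.sensors)),
    ]

stages = build_pipeline(ScenarioRequest(drive_log="drive_001", weather="rain"))
print([name for name, _ in stages])
```

The ordering reflects the text: reconstruction and simulation enhancement produce a physically grounded scene, Cosmos Transfer diversifies it, and Sensor RTX renders the final sensor streams.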