Uni-Gaussians: Unifying Camera and Lidar Simulation with Gaussians for Dynamic Driving Scenarios

Zikang Yuan1*, Yuechuan Pu2*, Hongcheng Luo2*, Fengtian Lang3, Cheng Chi2, Teng Li2, Yingying Shen2, Haiyang Sun2†, Bing Wang2, Xin Yang3
1Hong Kong University of Science & Technology
2Xiaomi EV
3Huazhong University of Science & Technology

*Indicates Equal Contribution
†Corresponding Author

Abstract

Ensuring the safety of autonomous vehicles requires comprehensive simulation of multi-sensor data, encompassing inputs from both cameras and LiDAR sensors, across various dynamic driving scenarios. Neural rendering techniques, which utilize collected raw sensor data to simulate these dynamic environments, have emerged as a leading methodology. While NeRF-based approaches can uniformly represent scenes for rendering data from both cameras and LiDAR, they are hindered by slow rendering speeds due to dense sampling. Conversely, Gaussian Splatting-based methods employ Gaussian primitives for scene representation and achieve rapid rendering through rasterization. However, these rasterization-based techniques struggle to accurately model non-linear optical sensors such as LiDAR, which restricts their applicability to sensors beyond pinhole cameras. To address these challenges and enable a unified representation of dynamic driving scenarios with Gaussian primitives, this study proposes a novel hybrid approach: rasterization is used to render image data, while Gaussian ray tracing is used to render LiDAR data. Experimental results on public datasets demonstrate that our approach outperforms current state-of-the-art methods. This work presents a unified and efficient solution for realistic simulation of camera and LiDAR data in autonomous driving scenarios using Gaussian primitives, offering significant advances in both rendering quality and computational efficiency.

Illustration

[Illustration figure]

Framework

Method Overview. The Gaussians of all scene elements are defined in their local or canonical spaces, and are deformed and transformed into the world space at a given time t. We unify camera and LiDAR simulation for the entire dynamic driving scenario: camera images are rendered by rasterization, while LiDAR data are rendered by computing the intersections between rays and the Gaussian ellipses to perform ray tracing.
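To make the LiDAR branch concrete, below is a minimal NumPy sketch of a ray-ellipse intersection of this kind: each planar Gaussian "splat" is intersected with the ray via its supporting plane, the Gaussian response is evaluated at the hit point, and the per-splat hit depths are alpha-composited front to back into an expected LiDAR range. The splat parameterization (center plus two scaled, orthogonal tangent axes), the function names, and the simple compositing scheme are illustrative assumptions for this sketch, not the paper's implementation.

import numpy as np

def ray_splat_intersection(o, d, mu, s_u, s_v):
    # Intersect the ray o + t*d with a planar Gaussian splat (an ellipse)
    # centered at mu with scaled tangent axes s_u, s_v (assumed orthogonal).
    # The response is exp(-0.5*(u^2 + v^2)) in the local (u, v) coordinates.
    n = np.cross(s_u, s_v)              # normal of the splat's plane
    denom = n @ d
    if abs(denom) < 1e-8:               # ray parallel to the splat plane
        return None
    t = (n @ (mu - o)) / denom          # ray-plane intersection depth
    if t <= 0.0:                        # splat lies behind the sensor
        return None
    p = o + t * d - mu                  # hit point relative to the center
    u = (p @ s_u) / (s_u @ s_u)         # local coordinates along the axes
    v = (p @ s_v) / (s_v @ s_v)
    response = np.exp(-0.5 * (u * u + v * v))
    return t, response

def trace_lidar_ray(o, d, splats):
    # splats: list of (mu, s_u, s_v, opacity) tuples. Front-to-back
    # compositing of the hit depths yields an expected range for the ray.
    hits = []
    for mu, s_u, s_v, opacity in splats:
        hit = ray_splat_intersection(o, d, mu, s_u, s_v)
        if hit is not None:
            t, r = hit
            hits.append((t, min(opacity * r, 0.999)))
    hits.sort(key=lambda h: h[0])       # nearest splat first
    depth, transmittance = 0.0, 1.0
    for t, alpha in hits:
        depth += transmittance * alpha * t
        transmittance *= 1.0 - alpha
    return depth

For instance, a ray with o = (0, 0, 0) and d = (1, 0, 0) hitting a single splat centered at (5, 0, 0) with axes (0, 1, 0) and (0, 0, 1) and opacity 0.9 yields an expected range of 4.5; a full renderer would additionally handle a background depth and normalize by the accumulated alpha.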

[Framework overview figure]

Visual Comparisons

[Side-by-side LiDAR rendering comparisons on four scenes: LiDAR4D (left) vs. Ours (right)]

Videos

BibTeX

@article{yuan2025uni,
  title={Uni-Gaussians: Unifying Camera and Lidar Simulation with Gaussians for Dynamic Driving Scenarios},
  author={Yuan, Zikang and Pu, Yuechuan and Luo, Hongcheng and Lang, Fengtian and Chi, Cheng and Li, Teng and Shen, Yingying and Sun, Haiyang and Wang, Bing and Yang, Xin},
  journal={arXiv preprint arXiv:2503.08317},
  year={2025}
}