RadarOcc

Robust 3D Occupancy Prediction with 4D Imaging Radar

NeurIPS 2024

* Denotes equal contribution


Pipeline

Overall pipeline of RadarOcc. The data volume reduction step pre-processes the 4DRT into a lightweight sparse RT via Doppler bins encoding and sidelobe-aware spatial sparsifying. We then apply spherical-based feature encoding on the sparse RT and aggregate the spherical features using Cartesian voxel queries. The 3D occupancy volume is finally output via 3D occupancy decoding.
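
To make the data flow concrete, below is a minimal PyTorch-style sketch of these stages. It is not the released implementation: the class name RadarOccSketch, the top-k Doppler descriptor, the 10% power threshold, and all shapes are illustrative assumptions.

# Minimal PyTorch-style sketch of the RadarOcc pipeline stages (hypothetical
# module/function names; shapes are illustrative, not the released code).
import torch
import torch.nn as nn


class RadarOccSketch(nn.Module):
    def __init__(self, num_doppler_desc=8, feat_dim=32, occ_classes=2):
        super().__init__()
        # Spherical-based feature encoding over (range, azimuth, elevation) bins.
        self.spherical_encoder = nn.Conv3d(num_doppler_desc, feat_dim, 3, padding=1)
        # 3D occupancy decoding head applied to the Cartesian voxel features.
        self.occ_decoder = nn.Conv3d(feat_dim, occ_classes, 1)

    def forward(self, radar_4d, cart_to_sph_grid):
        # radar_4d: (B, D, R, A, E) 4D radar tensor with D Doppler bins.
        # 1) Doppler bins encoding: compress the D Doppler bins into per-cell
        #    descriptors (here simply the top-k Doppler power values).
        desc, _ = radar_4d.topk(k=self.spherical_encoder.in_channels, dim=1)

        # 2) Sidelobe-aware spatial sparsifying: suppress low-power cells per
        #    range slice so sidelobe returns do not dominate (toy threshold).
        power = desc.sum(dim=1, keepdim=True)
        thresh = power.amax(dim=(3, 4), keepdim=True) * 0.1
        sparse_rt = desc * (power > thresh)

        # 3) Spherical-based feature encoding on the sparse RT.
        sph_feat = self.spherical_encoder(sparse_rt)          # (B, C, R, A, E)

        # 4) Spherical-to-Cartesian aggregation via voxel queries: sample the
        #    spherical features at the spherical coords of Cartesian voxels.
        cart_feat = nn.functional.grid_sample(
            sph_feat, cart_to_sph_grid, align_corners=True)   # (B, C, X, Y, Z)

        # 5) 3D occupancy decoding.
        return self.occ_decoder(cart_feat)                    # (B, K, X, Y, Z)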

Abstract

The 3D occupancy-based perception pipeline has significantly advanced autonomous driving by capturing detailed scene descriptions and demonstrating strong generalizability across various object categories and shapes. Current methods predominantly rely on LiDAR or camera inputs for 3D occupancy prediction. These methods are susceptible to adverse weather conditions, limiting the all-weather deployment of self-driving cars. To improve perception robustness, we leverage the recent advances in automotive radars and introduce a novel approach that utilizes 4D imaging radar sensors for 3D occupancy prediction. Our method, RadarOcc, circumvents the limitations of sparse radar point clouds by directly processing the 4D radar tensor, thus preserving essential scene details. RadarOcc innovatively addresses the challenges associated with the voluminous and noisy 4D radar data by employing Doppler bins descriptors, sidelobe-aware spatial sparsification, and range-wise self-attention mechanisms. To minimize the interpolation errors associated with direct coordinate transformations, we also devise a spherical-based feature encoding followed by spherical-to-Cartesian feature aggregation. We benchmark various baseline methods based on distinct modalities on the public K-Radar dataset. The results demonstrate RadarOcc's state-of-the-art performance in radar-based 3D occupancy prediction and promising results even when compared with LiDAR- or camera-based methods. Additionally, we present qualitative evidence of the superior performance of 4D radar in adverse weather conditions and explore the impact of key pipeline components through ablation studies.
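
As a companion to the pipeline sketch above, the snippet below illustrates one way the spherical-to-Cartesian voxel queries could be formed: converting Cartesian voxel centres to normalized spherical coordinates and sampling the spherical feature volume with grid_sample. The helper build_cart_to_sph_grid and its parameters are hypothetical and only meant to convey the idea, not to reproduce the paper's exact aggregation.

# Illustrative helper (not from the released code): build the normalized
# spherical-coordinate grid used to query spherical features at the centres
# of Cartesian voxels.
import torch


def build_cart_to_sph_grid(x_lim, y_lim, z_lim, vox, r_max, az_max, el_max):
    # Cartesian voxel centres within the region of interest.
    xs = torch.arange(x_lim[0] + vox / 2, x_lim[1], vox)
    ys = torch.arange(y_lim[0] + vox / 2, y_lim[1], vox)
    zs = torch.arange(z_lim[0] + vox / 2, z_lim[1], vox)
    x, y, z = torch.meshgrid(xs, ys, zs, indexing="ij")

    # Cartesian -> spherical (range, azimuth, elevation).
    r = torch.sqrt(x**2 + y**2 + z**2)
    az = torch.atan2(y, x)
    el = torch.asin(z / r.clamp(min=1e-6))

    # Normalise each coordinate to [-1, 1] as grid_sample expects; the last
    # dim is ordered (elevation, azimuth, range) to match a feature volume
    # laid out as (B, C, range, azimuth, elevation).
    grid = torch.stack([
        el / el_max,
        az / az_max,
        2 * r / r_max - 1,
    ], dim=-1)
    return grid.unsqueeze(0)  # (1, X, Y, Z, 3)

In use, the returned grid would be expanded along the batch dimension and passed as cart_to_sph_grid to the forward pass sketched earlier; querying in this direction avoids resampling the raw spherical measurements onto a Cartesian grid up front.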

Performance under normal weather

Here are some GIFs showing our qualitative results on 3D occupancy prediction under normal weather. Foreground voxels are colored red, while background voxels are colored green. Corresponding RGB videos are shown for reference.

Performance under adverse weather

Here are some GIFs showing our qualitative results under adverse weather. For the second and third demos, the LiDAR-based OpenOccupancy and the stereo image-based SurroundOcc are used for inter-modality comparison.

Acknowledgements

This research is partially supported by the Engineering and Physical Sciences Research Council (EPSRC) under the Centre for Doctoral Training in Robotics and Autonomous Systems at the Edinburgh Centre for Robotics (EP/S023208/1).

BibTeX

@article{Ding_2024_NeurIPS,
      title={Robust 3D Occupancy Prediction with 4D Imaging Radar},
      author={Ding, Fangqiang and Wen, Xiangyu and Zhu, Yunzhou and Li, Yiming and Lu, Chris Xiaoxuan},
      journal={Advances in Neural Information Processing Systems (NeurIPS)},
      year={2024}
    }