Fangqiang Ding

Postdoctoral Associate @ Massachusetts Institute of Technology


about me

I am a Postdoctoral Associate at MIT, working with Dr. Hermano Igo Krebs, director of The 77 Lab. Before MIT, I worked with Dr. Or Litany at Technion as a Postdoctoral Fellow. I was honored to be selected as a 2025 RSS Pioneer for my work on robust spatial perception for mobile robotics. I received my Ph.D. in Robotics and Autonomous Systems from the School of Informatics, University of Edinburgh, supervised by Dr. Chris Xiaoxuan Lu, and my B.Eng. in Mechanical Engineering from Tongji University.

🎯 My research agenda centers on Physical AI, which integrates advanced artificial intelligence with physical systems (e.g., self-driving cars, robots, wearables, industrial and IoT devices) to enable them to perceive, reason, and interact with the physical world. My long-term vision is a human-machine symbiotic ecosystem where human and embodied intelligence coexist, collaborate, and co-evolve.

🚀 Achieving this vision requires systems that are not only increasingly capable, but also trustworthy and scalable in the real world. Trustworthy systems operate safely, reliably, and robustly across diverse environments and tasks, while protecting human privacy and aligning with human values. Scalable systems can be developed and deployed efficiently at scale through more affordable sensing and computation, as well as data collection, annotation, and learning pipelines. These core values motivate my research to address three key challenges for real-world deployment: condition adaptivity, privacy awareness, and cost effectiveness.

💡 My recent and ongoing research directions include (but are not limited to):

  • Multisensory perception for reliable mobile autonomy in the wild
  • Generalized human motion and interaction sensing in the real world
  • Long-horizon robotic (loco-)manipulation across tasks and environments
  • Building deployable data flywheels for scalable robot learning

🤝 If you are interested in these directions and would like to explore collaboration opportunities, please feel free to reach out via email. Email · Resume · Google Scholar

news

Feb 21, 2026 🎉 Two papers (M4Human, PALM) accepted to CVPR-2026. See you in Denver, CO.
Oct 01, 2025 📖 Started working as a Postdoctoral Associate at MIT. Looking forward to more collaborations.
May 13, 2025 🎓 Successfully passed my PhD thesis viva. Many thanks to the committee and collaborators. Finally became Dr. Ding!
Apr 21, 2025 🤖 Selected for RSS Pioneers 2025 (a competitive early-career recognition from the robotics community). See you in Los Angeles, USA.
Feb 24, 2025 🎉 One paper accepted to ACM SenSys’25. See you in Irvine, USA.
Jan 27, 2025 📖 Accepted to serve as Associate Editor for IROS-2025. Looking forward to contributing.

selected publications

  1. CVPR’26
    M4Human: A Large-Scale Multimodal mmWave Radar Benchmark for Human Mesh Reconstruction
    Junqiao Fan, Yunjiao Zhou, Yizhuo Yang, and 6 more authors


    In IEEE Conference on Computer Vision and Pattern Recognition, 2026
  2. SenSys’25
    ThermoHands: A Benchmark for 3D Hand Pose Estimation from Egocentric Thermal Images
    Fangqiang Ding, Yunzhou Zhu, Xiangyu Wen, and 2 more authors
    In ACM Conference on Embedded Networked Sensor Systems, 2025
  3. NeurIPS’24
    RadarOcc: Robust 3D Occupancy Prediction with 4D Imaging Radar
    Fangqiang Ding, Xiangyu Wen, Yunzhou Zhu, and 2 more authors
    In Advances in Neural Information Processing Systems, 2024
  4. ECCV’24
    milliFlow: Scene Flow Estimation on mmWave Radar Point Cloud for Human Motion Sensing
    Fangqiang Ding, Zhen Luo, Peijun Zhao, and 1 more author
    In European Conference on Computer Vision, 2024
  5. CVPR’23
    Hidden Gems: 4D Radar Scene Flow Learning Using Cross-Modal Supervision
    Fangqiang Ding, Andras Palffy, Dariu M. Gavrila, and 1 more author
    In IEEE Conference on Computer Vision and Pattern Recognition, 2023
    Selected as Highlight (top 10%)