NTU MARS Lab (Multimodal AI and Robotic Systems Lab)

Projects

Our lab embodies the open-source spirit, advancing cutting-edge robotics and AIoT through innovative AI algorithms, datasets, and systems, with real-world applications spanning smart cities, smart homes, healthcare, and personalized assistive technologies.

REI-Bench: Benchmarking Vague Human Instructions in Task Planning

Models and explores how vague human instructions affect embodied task planning.

HoloLLM: A Multisensory Foundation Model for Language-Grounded Human Sensing and Reasoning

A multimodal large language model that integrates rare but powerful sensor inputs to achieve seamless human perception and reasoning.

Humanoid Teleoperation: Humanoid Robot Teleoperation Based on Meta Quest VR Headsets

Real-time teleoperation of complex humanoid robots via Meta Quest VR headsets, supporting diverse manipulation tasks and whole-body locomotion.

More

SenseFi: A Library and Benchmark for WiFi Sensing

The world's first open-source benchmark for deep learning-empowered WiFi sensing.

MM-Fi: A Multimodal 4D Human Dataset

The world's first multimodal, non-intrusive 4D human dataset (RGB, Depth, LiDAR, mmWave, WiFi).

ARID: A New Dataset for Human Action Recognition in the Dark

The dataset for the CVPR UG2+ Challenge (2021-2022).

Transfer Score: Benchmarking Unsupervised Domain Adaptation

Enables unsupervised evaluation and model selection for unsupervised domain adaptation (UDA).