Research Assistant (Visual-Language Manipulation)


NATIONAL UNIVERSITY OF SINGAPORE
Posted date: 8 days ago
Minimum level: N/A
Employment type: Full-time
Job category: Other
Interested applicants are invited to apply directly at the NUS Career Portal; applications will be processed only if submitted via the portal.

We regret that only shortlisted candidates will be notified.

Job Description

This position involves working on a project focused on efficient multimodal robot learning for manipulation, with an emphasis on vision-language-action (VLA) systems. The candidate will help bridge simulation and real robot systems to enable robust, safe manipulation in real-world environments.

The candidate will:

• Contribute to building manipulation pipelines that combine perception, language, and control.

• Implement and evaluate safety and uncertainty-aware modules to monitor and filter robot behaviors.

• Perform data collection, calibration, and annotation on robotic manipulators and mobile manipulation platforms (such as Mobile ALOHA).

• Develop and maintain simulation environments in Isaac Lab / Isaac Gym / PyBullet / Gazebo for training and testing.

• Work with large manipulation datasets (e.g. LIBERO, RoboCasa, DROID) to guide model training, generalization, and benchmarking.

• Collaborate with the PI and research team to design experiments, analyze results, document findings, and support dissemination (e.g. internal reports, code releases).

Qualifications

• Strong programming skills in Python (experience in C++ is a plus).

• Experience with ROS / ROS2 and robotics simulation tools (e.g. Isaac Lab / Isaac Gym / PyBullet / Gazebo).

• Background in robot manipulation, motion control, and trajectory planning.

• Familiarity with vision-language models / architectures (VLMs/VLAs) or multimodal learning in robotics.

• Experience or strong interest in robot data collection, teleoperation, calibration, and evaluation.

• Exposure to large-scale manipulation datasets such as LIBERO, RoboCasa, DROID, or similar.

• Preferred: experience with Mobile ALOHA or mobile manipulation platforms.

• Good analytical, troubleshooting, and experimental design skills.

• Ability to work independently as well as collaboratively within a research team.
Location: Singapore