Yixuan Huang

Hello, my name is Yixuan Huang. I am currently a Postdoctoral Research Associate at Princeton University, working with Prof. Tom Silver. I completed my Ph.D. in the Kahlert School of Computing at the University of Utah, where I was advised by Prof. Tucker Hermans. During my Ph.D., I was a Visiting Student Researcher (VSR) at Stanford University, working with Prof. Jeannette Bohg. I received my bachelor's degree in Computer Science and Engineering from Northeastern University in 2020. During my undergraduate years, I worked with Prof. Sicun Gao at UC San Diego.

Here is my Research Statement.

yh1542[at]princeton.edu  /  CV  /  Google Scholar  /  Github  /  Linkedin

Research
    My research aims to develop general-purpose robotic systems capable of structured and adaptive physical reasoning. Robots that interact with the physical world must reason about the kinematic and dynamic constraints imposed by their embodiment, their environment, and the task at hand. These often-entangled constraints can turn semantically simple tasks into challenging puzzles. To address these challenges, my research develops algorithmic frameworks that integrate perception, reasoning, learning, and planning within structured representations. Concretely, I have developed:
  • Methods for learning models that enable hierarchical planning directly from partial-view point clouds (RD-GNN, Points2Plans).
  • Systems that detect failures, recover from them, and leverage them to reduce future errors over time (Fail2Progress).
  • A benchmark that systematically studies robot physical reasoning across 25 environments (KinDER).
Publications
RoboVista figure
RoboVista: Evaluating Vision Language Models for Diverse Robot Applications

Shuangyu Xie, Kaiyuan Chen, Ziyang Chen, Simeon Adebola, Yixuan Huang, Zehan Ma, Tianshuang Qiu, Wentao Yuan, Dhruv Shah, Pannag R Sanketi, Ken Goldberg
In submission to Robotics: Science and Systems (RSS), 2026
project page (coming soon) / arXiv (coming soon)

Planning for Multi-Object Manipulation with Graph Neural Network Relational Classifiers

Yixuan Huang, Adam Conkey, Tucker Hermans
IEEE International Conference on Robotics and Automation (ICRA), 2023
project page / arXiv / code

Toward Learning Context-Dependent Tasks from Demonstration for Tendon-Driven Surgical Robots

Yixuan Huang, Michael Bentley, Tucker Hermans, Alan Kuntz
International Symposium on Medical Robotics (ISMR), 2021 (Best Paper Award Finalist, Best Student Paper Award Finalist)
project page / arXiv

Undergraduate research project: This project addressed safe reinforcement learning. Our goal was to design RL algorithms that maximize cumulative reward over time while avoiding collisions. I began with a classic racecar example and built a simulation environment in PyBullet. We designed two sub-policies, one for each objective, and employed an additional factored policy to select between them. Our final evaluation showed near-zero safety violations with low sample complexity on the test benchmarks.
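The factored-policy idea above can be illustrated with a minimal sketch. All names, the action strings, and the distance-threshold switching rule here are illustrative assumptions, not the project's actual implementation:

```python
# Hypothetical sketch of a factored safe-RL policy: a reward-seeking
# sub-policy and a safety sub-policy, with a selector choosing which
# one acts at each step. The threshold rule stands in for a learned
# selection policy.

def reward_policy(state):
    # Greedy sub-policy: make progress toward the goal.
    return "accelerate"

def safety_policy(state):
    # Conservative sub-policy: act to avoid collision.
    return "brake"

def factored_policy(state, safety_threshold=1.0):
    # Delegate to the safety sub-policy whenever the estimated
    # distance to the nearest obstacle falls below the threshold;
    # otherwise let the reward-seeking sub-policy act.
    if state["obstacle_distance"] < safety_threshold:
        return safety_policy(state)
    return reward_policy(state)
```

In the actual system the selector would itself be learned, trading off expected reward against estimated collision risk rather than using a fixed threshold.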

Undergraduate research project: In this project, we used a drone to fly around an object and automatically capture a set number of photos for a 3D reconstruction task. We combined a reinforcement learning algorithm with state estimation to find the optimal drone trajectory for achieving high-quality 3D reconstruction.

Ph.D. first-year rotation project: In this work, we take steps toward developing a system capable of learning to perform context-dependent surgical tasks by learning directly from expert demonstrations. To achieve this, we present and evaluate three approaches for generating context variables from the robot's environment. The environment is represented by partial-view point clouds, with approaches ranging from fully human-specified to fully automated.

During the middle of my Ph.D., I worked with the KUKA iiwa robot equipped with a 3-fingered underactuated hand with built-in TakkTile pressure sensors. I designed various manipulation primitives (e.g., grasp, place, dump, push) and executed them on the KUKA iiwa robot. I published three papers (RD-GNN, eRDTransformer, and DOOM and LOOM) with this robot.

During my visit to Stanford University, I worked with a customized holonomic mobile base combined with a Kinova arm. This robot is capable of navigating, grasping, placing, pushing, pulling, and even tossing! Using these primitives, the robot can even feed you snacks (e.g., an apple). I completed a project (Points2Plans) with this impressive robot.

For the final project of my Ph.D., I worked with a Stretch robot. I enjoyed working with this lightweight yet capable robot. I designed multiple primitives for the Stretch and focused on my final project, which involved reasoning about failure cases.

Awards
  • International Symposium on Medical Robotics Best Paper Award Finalist (2021)
  • International Symposium on Medical Robotics Best Student Paper Award Finalist (2021)
  • National Scholarship by Ministry of Education of China (2017)
  • National Scholarship by Ministry of Education of China (2018)


Website source from Chris Agia