About Me

Welcome! I am Yifei Yao, a Master’s student in the School of Automation at Shanghai Jiao Tong University, working in the Machine Vision and Autonomous Systems Lab under the supervision of Prof. Jun-Guo Lu. I expect to graduate in April 2026 and am actively pursuing Ph.D. opportunities in robotics and embodied AI.

My research focuses on Embodied AI for Humanoid Robots, where I develop intelligent control systems that enable robots to understand and interact with their environments through language-guided planning and decision-making. With an interdisciplinary background in automation and economics, I am particularly passionate about bridging the gap between high-level reasoning and low-level motor control while considering the commercial viability and societal impact of humanoid robotics.

Core Research Contributions

My work centers on developing unified learning frameworks that generalize across different humanoid morphologies and tasks. My flagship contribution is the Generalized Behavior Cloning (GBC) framework, which addresses fundamental challenges in humanoid robot control through cross-morphology learning and task generalization.

Research Interests

  • Embodied Control for Humanoids: Developing advanced control algorithms for bipedal robots and humanoid systems, with a focus on commercial deployment feasibility
  • Language Model Guided Planning & Decision Making: Integrating large language models with robotic control for intelligent task execution and cost-effective automation
  • Reinforcement Learning & Imitation Learning: Creating robust learning frameworks for complex robotic behaviors with optimized sample efficiency
  • Computer Vision & Diffusion Models: Applying cutting-edge vision techniques to robotic perception and motion generation
  • Techno-Economic Analysis: Evaluating the economic viability and market potential of humanoid robotics solutions across different industrial sectors

Current Research & Industry Collaborations

I am currently working as a Cooperative Reinforcement Learning Engineer at Baosight Group, where I develop humanoid RL control algorithms and imitation learning systems deployed on their humanoid robots. My research combines technical innovation with market-driven analysis, applying economic principles to optimize deployment strategies and cost-effectiveness:

  • Baosight Group (2024.09-present): Leading development of production-ready humanoid control systems, successfully deploying GBC-based algorithms on commercial humanoid platforms for manufacturing applications. Conducting ROI analysis to identify optimal automation scenarios and cost-reduction opportunities.

  • Limx Dynamics (2024.05-2024.09): Collaborated on improved control algorithms for point-foot legged robots, contributing to their next-generation locomotion systems with enhanced stability and efficiency. Applied techno-economic analysis to evaluate market positioning strategies.

  • ZHENDUI Ltd. (2024.01-2024.05): Developed real-time sea surface obstacle detection and ranging systems achieving 10+ FPS on NVIDIA A4000, implementing advanced computer vision algorithms for maritime safety applications while assessing commercial viability and market penetration potential.

  • SAIC Motor (2023.06-2023.12): Created real-time 3D surround-view systems for MPV vehicles using Android and OpenGL ES, deployed in production vehicles for enhanced driver assistance.

  • Shanghai Sport University (2022.09-2023.04): Built real-time 3D human pose estimation systems for athletic performance analysis, supporting training programs for multiple Olympic sports.

Publications & Research Output

My research has resulted in publications that advance the state of the art in humanoid robotics:

Accepted:

  • IROS 2025 (Oral): “AnyBipe: An End-to-End Framework for Training and Deploying Bipedal Robots Guided by Language Models” - Introduces language-guided reward design and evaluation for bipedal locomotion with cross-task generalization capabilities

Published Preprints:

  • arXiv 2024: “GBC: Generalized Behavior Cloning for Humanoid Robots” - A unified framework enabling cross-morphology learning that significantly improves sample efficiency and task generalization in humanoid control

In Progress:

  • Continuing research on multimodal learning frameworks for autonomous humanoid systems

Key Technical Innovations:

  • Developed cross-morphology learning algorithms that enable knowledge transfer between different humanoid platforms
  • Created unified behavior representation frameworks that bridge locomotion and manipulation tasks
  • Pioneered language-guided planning systems for real-time humanoid control
  • Achieved significant improvements in sample efficiency compared to traditional reinforcement learning approaches

Education & Background

  • M.S. in Automation, Shanghai Jiao Tong University (2023-2026)
    • Machine Vision and Autonomous Systems Lab
    • Advisor: Prof. Jun-Guo Lu
    • First Prize Scholarship recipient (2023-2026)
  • B.E. in Automation, Shanghai Jiao Tong University (2019-2023)
    • Zhiyuan Honored Scholarship recipient (2019-2023)
  • B.E. in Economics, Shanghai Jiao Tong University (2019-2023)

Technical Expertise

  • Programming Languages: C++, Python, Java, Kotlin, CUDA
  • Development Platforms: Qt, OpenGL, Android Studio
  • ML/AI Frameworks: PyTorch, TensorFlow, cuDNN, OpenCV
  • Robotics Simulation: Isaac Sim, MuJoCo, ROS1 & ROS2

Awards & Achievements

  • RoboCup 2021: First Prize
  • MCM/ICM 2021: Outstanding Winner (Meritorious Winner)
  • First Prize Scholarship: Master’s program (2023-2026)
  • Zhiyuan Honored Scholarship: Undergraduate program (2019-2023)

Looking Forward

I am actively seeking Ph.D. opportunities to further advance my research in embodied AI and humanoid robotics. My research vision focuses on developing autonomous humanoid systems that can seamlessly adapt to new environments and tasks through advanced learning paradigms.

Research Vision & Goals:

  • Advancing cross-morphology learning to enable universal humanoid control systems
  • Developing next-generation multimodal AI that integrates vision, language, and physical reasoning
  • Creating scalable frameworks for real-world deployment of intelligent humanoid systems
  • Bridging the gap between laboratory research and industrial applications in humanoid robotics

My long-term goal is to contribute foundational technologies that will enable humanoid robots to become reliable partners in human-centric environments, from manufacturing floors to domestic settings.

Personal Interests: Beyond research, I enjoy developing personal applications, exploring anime culture, playing CRPGs, and practicing traditional Chinese instruments.

Research Materials: My publications and technical reports are available in the Files section, including the GBC framework technical documentation.

Contact: Feel free to reach out at jameswhiteyao@gmail.com or godchaser@sjtu.edu.cn