About the Course

Train and deploy whole-body control systems using Stanford’s Mobile ALOHA platform, a mobile manipulator designed for high-dexterity tasks in constrained environments. This course gives you hands-on access to real mobile manipulation hardware, cutting-edge imitation learning techniques, and ROS 2-based deployment pipelines.

You’ll go from dataset collection to autonomous execution — with code that runs on the real robot.

🧠 What You’ll Learn

Whole-Body Control for Mobile Manipulators
Implement bimanual and base movement control strategies for real-world tasks (e.g., drawer opening, two-handed object transfer).
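
To give a flavor of the command interface involved, here is a minimal ROS 2 sketch that sends a base velocity and a set of arm joint targets together. The topic names, message types, and joint count are illustrative assumptions for this page, not the actual Mobile ALOHA interfaces used in the course.

    # Minimal whole-body command sketch (illustrative only).
    # Topic names and joint count are assumptions, not the real Mobile ALOHA interfaces.
    import rclpy
    from rclpy.node import Node
    from geometry_msgs.msg import Twist
    from sensor_msgs.msg import JointState

    class WholeBodyCommander(Node):
        def __init__(self):
            super().__init__('whole_body_commander')
            # One publisher for the mobile base, one for an arm (placeholder topic names).
            self.base_pub = self.create_publisher(Twist, '/base/cmd_vel', 10)
            self.arm_pub = self.create_publisher(JointState, '/left_arm/joint_cmd', 10)

        def send_command(self, base_vx, base_wz, arm_positions):
            # Base command: forward velocity plus yaw rate.
            twist = Twist()
            twist.linear.x = base_vx
            twist.angular.z = base_wz
            self.base_pub.publish(twist)
            # Arm command: target joint positions for an assumed 6-DOF arm.
            js = JointState()
            js.name = [f'joint_{i}' for i in range(6)]
            js.position = list(arm_positions)
            self.arm_pub.publish(js)

    def main():
        rclpy.init()
        node = WholeBodyCommander()
        # Drive forward slowly while holding a fixed arm pose.
        node.send_command(0.1, 0.0, [0.0, -0.5, 0.8, 0.0, 0.3, 0.0])
        node.destroy_node()
        rclpy.shutdown()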

Imitation Learning & Dataset Bootstrapping
Record demonstrations via whole-body teleoperation and train high-performing policies using supervised behavior cloning.
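
For a taste of the training side, here is a minimal supervised behavior-cloning sketch in PyTorch: a small policy network regresses recorded actions from observations. The observation/action sizes and the MLP are placeholders for illustration; the course codebase provides its own architectures and data pipeline.

    # Minimal behavior-cloning sketch (shapes and network are illustrative placeholders).
    import torch
    import torch.nn as nn

    obs_dim, act_dim = 64, 16  # assumed observation/action sizes
    policy = nn.Sequential(
        nn.Linear(obs_dim, 256), nn.ReLU(),
        nn.Linear(256, 256), nn.ReLU(),
        nn.Linear(256, act_dim),
    )
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)

    def train_step(obs_batch, act_batch):
        """One supervised step: regress the demonstrated action."""
        loss = nn.functional.mse_loss(policy(obs_batch), act_batch)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

    # Stand-in data for a batch of 32 demonstration frames.
    obs = torch.randn(32, obs_dim)
    act = torch.randn(32, act_dim)
    print(train_step(obs, act))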

ROS 2 Multi-Modal Perception Integration
Fuse RGB-D video, proprioception, and teleoperation command signals, and train a transformer-based policy for autonomous operation.
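
As an illustration of what multi-modal integration looks like at the ROS 2 level, the sketch below time-synchronizes a camera stream with joint states before handing them to a policy. The topic names are placeholder assumptions, not the course's actual configuration.

    # Minimal multi-modal subscriber sketch (topic names are placeholder assumptions).
    import rclpy
    from rclpy.node import Node
    from sensor_msgs.msg import Image, JointState
    from message_filters import Subscriber, ApproximateTimeSynchronizer

    class MultiModalListener(Node):
        def __init__(self):
            super().__init__('multi_modal_listener')
            image_sub = Subscriber(self, Image, '/camera/color/image_raw')
            joint_sub = Subscriber(self, JointState, '/joint_states')
            # Pair up samples whose timestamps fall within 50 ms of each other.
            sync = ApproximateTimeSynchronizer([image_sub, joint_sub],
                                               queue_size=10, slop=0.05)
            sync.registerCallback(self.on_sample)

        def on_sample(self, image, joints):
            # A real pipeline would stack these into a policy observation.
            self.get_logger().info(
                f'image {image.width}x{image.height}, {len(joints.position)} joints')

    def main():
        rclpy.init()
        rclpy.spin(MultiModalListener())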

Human-in-the-Loop Debugging & Autonomy Transfer
Refine learned behaviors through real-time evaluation and policy fine-tuning using open-source ALOHA tools.

Final Challenge: Autonomously Execute a Real Task
Cook, clean, manipulate, or navigate: you'll deploy your trained model to execute a mobile manipulation task from scratch.

🧪 Platform Details

  • Hardware: Mobile ALOHA robot — bimanual arms + mobile base

  • Sim: NVIDIA Isaac + ALOHA teleop stack (custom tooling provided)

  • Sensors: RealSense RGB-D cameras + joint encoders + mobile base odometry

  • Software: ROS 2 (Foxy), Python 3.10+, PyTorch, behavior cloning codebase

  • GitHub Access: Full codebase and open dataset support

📅 Format & Details

Duration: 4 Days, In-Person
Location: Portland State University Robotics Lab
Cohort Size: 10 Engineers Max
Robot Access: Hands-on with real Mobile ALOHA units (1 per 2 students)

Included:

  • 12-month access to all code, videos, and tools

  • GitHub repository with reproducible projects

  • Certificate of Completion + ROS 2 skill badge

  • Task-testing arena with real lab equipment and tools

  • Mentorship and real-time support

  • Coffee, lunch, and troubleshooting sessions

🎯 Who This Is For

Engineers serious about high-DOF robotics and AI deployment. Ideal for:

  • Robotics engineers moving into mobile manipulation

  • AI/ML practitioners wanting real-world imitation learning exposure

  • Technical founders building service robots or human-assist platforms

  • PhD students or advanced researchers looking to deepen ROS 2 fluency

💻 Prerequisites

  • Python proficiency + ROS 2 fundamentals

  • Familiarity with GitHub workflows

  • Some background in ML or motion planning (experience with behavior cloning preferred)

  • We offer a free ROS 2 prep module; ask us if you need a refresher.

💰 Tuition

$1,995
Includes full hardware access, GitHub repos, certificate, meals, and post-course code support.

Instructor

  • Joseph Cole

    PhD, Applied Physics

    Joseph earned his PhD in applied physics from Rice University and a graduate certificate in applied statistics from Portland State University. He is a retired Major in the US Army Reserve and has over 20 years of experience developing computer vision and machine learning algorithms at companies including Northrop Grumman and Applied Materials.
