Robotics Lunch and Learn: Startup Edition

May 29, 2025, 6:00 PM to 8:00 PM
Metro Region Innovation Hub - Portland, OR
Register Here
Join us for this technical and hands-on exploration at the intersection of robotics, machine learning, and human-computer interaction. 

This workshop, hosted by Joseph Cole, PhD of Rose City Robotics, offers a unique opportunity to engage directly with cutting-edge research and technology.

In addition to our technical talk on applications of AI in robotics, there will be time for networking along with hands-on interaction with Rosy, our open-source robot inspired by Stanford's open-source ALOHA system.

We'll start with a lecture on machine learning and how transformer neural networks are applied in robotics. Then we'll have lunch and a hands-on workshop where you'll get to teleoperate dual robotic arms, contribute to real-world datasets, and see where the industry is heading. Come ready to learn, experiment, and help shape the future of robotics.

Whether you are an entrepreneur, investor, researcher, engineer, or automation expert, come join us for this hands-on event.

Run of Show

  • 10:45–11:00 AM: Check-in
  • 11:00–11:15 AM: Welcome and intros
  • 11:15 AM–12:00 PM: Technical talk by Joseph Cole, PhD
    • Deep dive on Transformer Neural Networks and their application in robotics
    • Update on the state of the art in the industry and the future of physical AI
  • 12:00–1:00 PM: Lunch + networking
    • Hands-on teleoperation of the ALOHA open-source robot
    • Job openings, local events, and community building

Teleoperate Two Robotic Arms to Collect Datasets

We call our robot Rosy; it is built to the open-source hardware specs published by Stanford here. Rosy lets a human teleoperate two robotic arms: the operator moves their hands, wrists, elbows, and shoulders in natural movements, and the two follower arms mimic those actions. Three onboard video cameras and a laptop record the joint-position data, the visual data, and the mapping to the robot joints.
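As a rough illustration of how a teleoperation recording loop like this might work, here is a minimal sketch. The function names, rates, and data layout are hypothetical, not Rosy's actual software; a real system would also log synchronized camera frames and issue hardware commands to the follower arms.

```python
import time

def read_leader_joints():
    """Hypothetical stand-in for reading the human-driven leader arm's joints."""
    return [0.0] * 6  # six joint angles for one arm, for illustration

def record_episode(duration_s=10.0, hz=50):
    """Mirror leader-arm joints onto a follower arm while logging a dataset.

    Each timestep stores the commanded joint positions with a timestamp;
    the resulting episode is one demonstration in the training dataset.
    """
    episode = []
    dt = 1.0 / hz
    steps = int(duration_s * hz)
    for _ in range(steps):
        joints = read_leader_joints()   # human moves the leader arm
        # follower.command(joints)      # follower mimics the motion (hardware call)
        episode.append({"t": time.monotonic(), "joints": joints})
        time.sleep(dt)
    return episode
```

Collecting many such episodes of the same task is what produces the 50–100 demonstrations used for training.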

With 50–100 examples of a human performing a task, Stanford researchers demonstrated transferring the skill to the robot, which then performed it autonomously; their Transformer Neural Network approach achieved around 85–95% accuracy.
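The underlying recipe is imitation learning: treat the demonstrations as supervised (observation, action) pairs and fit a policy to minimize prediction error. Stanford's system uses a transformer policy; the toy below swaps in a one-dimensional linear model (a hypothetical simplification, not the actual method) purely to show the "learn from demonstrations" loop.

```python
def train_behavior_cloning(demos, lr=0.1, epochs=200):
    """Fit a 1-D linear policy a = w*o + b to (observation, action) pairs
    by gradient descent on mean squared error.

    This illustrates behavior cloning, the supervised core of imitation
    learning; real robot policies replace the linear model with a neural
    network mapping camera images and joint states to joint commands.
    """
    w, b = 0.0, 0.0
    n = len(demos)
    for _ in range(epochs):
        gw = gb = 0.0
        for o, a in demos:
            err = (w * o + b) - a   # policy prediction minus demonstrated action
            gw += 2 * err * o / n
            gb += 2 * err / n
        w -= lr * gw
        b -= lr * gb
    return w, b
```

With demonstrations drawn from the rule a = 2o + 1, the fitted weights converge toward w ≈ 2 and b ≈ 1.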

We will have our robot at the workshop and encourage lots of tinkering and playing: stacking blocks, doing tasks, collecting datasets, and troubleshooting the inevitable hardware issues.
