Robotics Lunch and Learn: Startup Edition

June 24, 2025, 6:00 PM to 8:00 PM
Metro Region Innovation Hub - Portland, OR
Register Here
The automation industry is undergoing a great disruption, and we believe it is about to experience its "ChatGPT Moment."

Join us for this exploration of the current state of robotics, machine learning, and human-computer interaction. 

This workshop, hosted by Joseph Cole, PhD, of Rose City Robotics, offers a unique opportunity to engage directly with cutting-edge research and technology.

We'll start with an update on the state of the art in robotics and the disruption that is happening in the traditional automation industry.

Then we'll cover a high-level overview of machine learning and how transformer neural networks are applied in robotics.

We'll wrap up with a lunch and networking session.

Whether you are an entrepreneur, investor, creative designer, software engineer, tinkerer, or researcher, come join us to explore the opportunities in robotics.

Run of Show:
  • 10:45-11:00 am: Check-in
  • 11:00-11:15 am: Welcome and intros
  • 11:15 am-12:00 pm: Technical talk by Joseph Cole, PhD
    • Update on the state of the art in the industry and the future of physical AI
    • Overview of Transformer Neural Networks and their application in robotics
  • 12:00-1:00 pm:
    • Lunch + Networking
    • Hands-on teleoperation of Stanford's ALOHA open-source robot
    • Announcements, job openings, local events, and community building

Teleoperate Two Robotic Arms to Collect Datasets

We are experimenting with an ALOHA robot built to the open-source hardware specs published by Stanford. ALOHA allows a human to teleoperate two robotic arms: the operator moves their hands, wrists, elbows, and shoulders in natural movements, and the two follower arms mimic those actions. Three onboard video cameras and a laptop record all of the joint-position data, visual data, and the mapping to the robot joints.
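In code, a teleoperation session boils down to a time series of synchronized records. The sketch below is illustrative only (the class names, joint counts, and camera names are our assumptions, not the actual ALOHA software), but it shows the shape of the data one demonstration collects:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Timestep:
    t: float
    leader_joints: list[float]       # joint angles the human sets on the leader arms
    follower_joints: list[float]     # measured joint angles on the follower arms
    camera_frames: dict[str, bytes]  # raw frames keyed by camera name

@dataclass
class Episode:
    task: str
    steps: list[Timestep] = field(default_factory=list)

def record_step(episode, read_leader, read_follower, read_cameras):
    """Append one synchronized observation to the episode."""
    episode.steps.append(Timestep(
        t=time.monotonic(),
        leader_joints=read_leader(),
        follower_joints=read_follower(),
        camera_frames=read_cameras(),
    ))

# Stand-in sensor readers for illustration; real ones would talk to the hardware.
episode = Episode(task="stack_blocks")
for _ in range(3):
    record_step(
        episode,
        read_leader=lambda: [0.0] * 14,   # 2 arms x 7 joints (assumed layout)
        read_follower=lambda: [0.0] * 14,
        read_cameras=lambda: {"top": b"", "left_wrist": b"", "right_wrist": b""},
    )

print(len(episode.steps))  # → 3
```

Episodes recorded this way become the training examples for the learning step described next.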

With 50-100 examples of a human performing a task, Stanford researchers demonstrated that the skill can be transferred to the robot and performed autonomously, using Transformer Neural Networks, with around 85-95% accuracy.
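The underlying recipe is behavior cloning: fit a policy so that its predicted actions match the human's demonstrated actions, with the transformer serving as the function approximator. A minimal sketch of that training signal, using made-up numbers and no real network:

```python
def mse_loss(predicted, target):
    """Mean squared error between a predicted action and a demonstrated action."""
    return sum((p - t) ** 2 for p, t in zip(predicted, target)) / len(target)

# One demonstrated action: follower joint targets recorded during teleoperation
# (values are illustrative, not real joint angles).
demo_action = [0.10, -0.25, 0.40]

# What the policy currently predicts for the same observation.
policy_output = [0.12, -0.20, 0.35]

# Training repeatedly nudges the network's weights to shrink this number
# across all 50-100 demonstrations.
loss = mse_loss(policy_output, demo_action)
print(round(loss, 4))  # → 0.0018
```

In practice the transformer predicts a short chunk of future actions at each step rather than a single one, but the objective is the same: imitate the recorded human behavior.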

We will have our robot at the workshop and encourage lots of tinkering and playing: stacking blocks, doing tasks, collecting datasets, and troubleshooting the inevitable hardware issues.
