Compelling robot behaviors through supervised learning and choreorobotics

Abstract/Contents

Abstract
As robots transition from industrial and research settings into everyday human environments, they must be able to (1) learn from humans while benefiting from the full range of the humans' knowledge and (2) learn to interact with humans in safe, intuitive, and social ways. This thesis describes a series of compelling robot behaviors in which human perception and interaction are foregrounded across a variety of tasks. Supervised learning is incorporated in three projects to improve robots' capabilities in dynamic, changing environments.

In the first project, robots learned a door-opening task from human teleoperators. During data collection, many failure modes occurred: for example, the human teacher applied too much force to the door handle through the robot, or was uncertain about when the robot was contacting the door. To address this, we wrote software that mapped the robot's force-torque sensor data to real-time haptic feedback for the human demonstrator. When haptic feedback was provided during data collection, performance improved both for human data collection (efficiency of the learning-from-demonstration teacher's examples) and for autonomous robot door opening (successful deployment of the learned policy on the robot).

In the second project, robots learned to navigate in response to human gestures by pairing imitation learning with model predictive control. Humans and the environment were observed through a series of images, which were fed to a neural network that output navigation points; these navigation points were then passed to a model predictive control algorithm. This was a novel combination of a high-level neural network with a low-level navigation planner. Trained policies responded correctly to human gestures in the majority of cases across different gesture scenarios, showing that visual imitation learning paired with model predictive control effectively results in gesture-aware navigation.

In the third project, robots' movement generated music in real time through software. Through two experiments, we showed that augmenting a robot with motion-generated music increased its likeability and perceived intelligence. In the fourth project, robots learned to move in groups based on a choreographer's preferences while running the music generation software from the third project. In an experiment for the fourth project, we demonstrated that our multi-robot flocking system with gesture and musical accompaniment effectively engaged and enthralled humans. We trained a model to select between different navigation modes based on how a human choreographer made selections, and we observed that humans' perceptions of the robots and their overall experience were not significantly altered by the method (human choreographer selection, model prediction, or a control condition) by which the robots' behaviors were modulated.

The compelling robot behaviors described in this thesis elucidate how teaching interfaces and interactions with robots in everyday settings can be appealing, effective, and delightful.
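To make the second project's pipeline concrete, the sketch below pairs a stand-in visual policy with a toy model-predictive tracker for a unicycle robot. This is a minimal illustration under assumed names and parameters (GesturePolicy, mpc_select_command, the candidate velocity set), not the thesis implementation: the "network" here simply returns a fixed waypoint where a trained model would run a forward pass over camera images.

# Hedged sketch: a high-level policy proposes a navigation point from an image,
# and a simple MPC-style search picks the velocity command that best reaches it.
# All names and numbers are illustrative assumptions, not the thesis codebase.

import numpy as np

class GesturePolicy:
    """Stand-in for the trained visual imitation-learning network."""
    def predict_waypoint(self, image: np.ndarray) -> np.ndarray:
        # Placeholder: a real policy would map the camera image of the human
        # and scene to an (x, y) navigation point in the robot frame.
        return np.array([2.0, 1.0])

def mpc_select_command(state, waypoint, horizon=10, dt=0.1,
                       candidates=((0.5, 0.0), (0.5, 0.3), (0.5, -0.3),
                                   (0.0, 0.5), (0.0, -0.5))):
    """Roll out each candidate (v, w) command on a unicycle model and return
    the one whose predicted endpoint lands closest to the waypoint."""
    best_cmd, best_cost = None, np.inf
    for v, w in candidates:
        x, y, theta = state
        for _ in range(horizon):
            x += v * np.cos(theta) * dt
            y += v * np.sin(theta) * dt
            theta += w * dt
        cost = np.hypot(x - waypoint[0], y - waypoint[1])
        if cost < best_cost:
            best_cmd, best_cost = (v, w), cost
    return best_cmd

if __name__ == "__main__":
    policy = GesturePolicy()
    state = (0.0, 0.0, 0.0)              # x, y, heading of the robot
    image = np.zeros((240, 320, 3))      # placeholder camera frame
    waypoint = policy.predict_waypoint(image)
    v, w = mpc_select_command(state, waypoint)
    print(f"waypoint={waypoint}, command: v={v:.2f} m/s, w={w:.2f} rad/s")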

Description

Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place California
Place [Stanford, California]
Publisher [Stanford University]
Copyright date ©2023
Publication date 2023
Issuance monographic
Language English

Creators/Contributors

Author Cuan, Caitlin Rose
Degree supervisor Okamura, Allison
Thesis advisor Okamura, Allison
Thesis advisor Follmer, Sean
Thesis advisor Kennedy, Monroe
Degree committee member Follmer, Sean
Degree committee member Kennedy, Monroe
Associated with Stanford University, School of Engineering
Associated with Stanford University, Department of Mechanical Engineering

Subjects

Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Caitlin Rose Cuan.
Note Submitted to the Department of Mechanical Engineering.
Thesis Thesis Ph.D. Stanford University 2023.
Location https://purl.stanford.edu/kb164ng7952

Access conditions

Copyright
© 2023 by Caitlin Rose Cuan
License
This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).
