Robot telemanipulation in unstructured environments: sensors, algorithms, interfaces
Abstract/Contents
- Abstract
- This dissertation presents methods for robot teleoperation, or equivalently, human-in-the-loop robotics. Human-in-the-loop systems have the potential to handle complex tasks by combining the cognitive skills of a human operator with autonomous tools and behaviors. Along these lines, we present novel methods in grasp planning, haptic (force-feedback) rendering, and robot control which allow synergy in interaction between a human operator and a robot. We describe the interfaces that employ these algorithms and validate them through user experiments. Our goal is to see robot technologies make a bigger impact in people's everyday lives, getting robots out of the laboratory and factory, and into homes, offices, and other unstructured human spaces. Our algorithms focus on three distinct areas of telerobotic manipulation but are unified by their common reliance on 3D point cloud data obtained from emerging sensor technology; we do not depend on environment or object models known a priori, since it is difficult to anticipate the things a robot will encounter in unstructured settings. First, since grasping is a prerequisite for many manipulation tasks, we present two algorithms for planning grasps on clusters of 3D points. Next, we explore how to perform force-feedback haptic rendering of 3D point cloud data. This enables an operator to use the sense of touch to learn about environment geometry and potential collisions. Finally, we present a controller that uses a sequence of convex optimization steps to produce constrained arm motions that follow time-varying goal poses commanded by an operator. Using 3D sensor data to form motion constraints in real time, the robot is responsive to changing goals from the user yet also avoids collisions and unfavorable arm configurations. We demonstrate the integration of our algorithms into a telerobotic system that enables an operator to perform varied and unscripted manipulation tasks in arbitrary settings. We describe tools for navigation, perception, and manipulation, ranging from direct control of a gripper or mobile base to autonomous sub-modules that perform collision-free base navigation or arm motion planning. Most importantly, we share results from testing these interfaces in a variety of settings, including user studies with non-expert operators and a case study with a motor-impaired operator using the robot in his own home.
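To make the haptic-rendering idea above concrete, the following is a minimal sketch of a penalty-style force computed for a spherical probe against a 3D point cloud. This is an illustrative simplification only, not the dissertation's actual rendering algorithm; the function and parameter names (`haptic_force`, `probe_radius`, `stiffness`) are hypothetical.

```python
import numpy as np

def haptic_force(cloud, probe_center, probe_radius, stiffness=200.0):
    """Penalty force pushing a spherical probe out of a 3D point cloud.

    cloud: (N, 3) array of points; probe_center: (3,) probe position.
    Returns a (3,) force vector; zero when no point penetrates the probe.
    """
    d = cloud - probe_center               # vectors from probe to each point
    dist = np.linalg.norm(d, axis=1)       # distance to each cloud point
    i = int(np.argmin(dist))               # index of the nearest point
    penetration = probe_radius - dist[i]   # how far the point is inside the probe
    if penetration <= 0.0:
        return np.zeros(3)                 # free space: no force
    normal = -d[i] / dist[i]               # direction pushing probe away from point
    return stiffness * penetration * normal
```

With a single cloud point at the origin and the probe center at (0.05, 0, 0) with radius 0.1 m, the point penetrates 0.05 m and the force pushes the probe along +x. A real renderer would use a spatial index (e.g., a k-d tree) rather than a brute-force nearest-neighbor search, and a proxy object to avoid force discontinuities.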
Description
| Type of resource | text |
| --- | --- |
| Form | electronic; electronic resource; remote |
| Extent | 1 online resource. |
| Publication date | 2013 |
| Issuance | monographic |
| Language | English |
Creators/Contributors
| Associated with | Leeper, Adam Eric |
| --- | --- |
| Associated with | Stanford University, Department of Mechanical Engineering. |
| Primary advisor | Salisbury, J. Kenneth |
| Thesis advisor | Cutkosky, Mark R. |
| Thesis advisor | Khatib, Oussama |
Subjects
| Genre | Theses |
| --- | --- |
Bibliographic information
| Statement of responsibility | Adam Eric Leeper. |
| --- | --- |
| Note | Submitted to the Department of Mechanical Engineering. |
| Thesis | Thesis (Ph.D.)--Stanford University, 2013. |
| Location | electronic resource |
Access conditions
- Copyright
- © 2013 by Adam Eric Leeper
- License
- This work is licensed under a Creative Commons Attribution-NonCommercial 3.0 Unported license (CC BY-NC 3.0).