Using Muscle and Brain Signals to Supervise Robots

Videos

Video accompanying the MIT News Article and RSS 2018 Paper
(view on YouTube)

Overview

What if we could control robots more intuitively, using just hand gestures and brainwaves?

Robots are becoming more common in settings ranging from factories and labs to classrooms and homes, yet there's still something of a language barrier when we try to communicate with them. Instead of writing code or learning specific keywords and new interfaces, we'd like to interact with robots the way we do with other people. This is especially important in safety-critical scenarios, where we want to detect and correct mistakes before they actually happen.

Taking a step towards this goal, we use the brain and muscle signals that a person naturally generates to create a fast and intuitive interface for supervising a robot. In our experiments, the robot chooses from multiple targets for a mock drilling task. We process brain signals to detect whether the person thinks the robot is making a mistake, and we process muscle signals to detect when they gesture to the left or right; together, these allow the person to stop the robot immediately just by mentally evaluating its choices, and then to indicate the correct choice by scrolling through the options with gestures.
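To make this flow concrete, below is a minimal Python sketch of how the two signal streams could be combined in a supervisory loop. Everything in it is a hypothetical stand-in: detect_errp and classify_gesture are stub classifiers, and the target list and fake signal streams are toy placeholders rather than the system's actual models or interfaces.

import random

def detect_errp(eeg_window):
    """Stub for the EEG classifier: True when an error-related potential
    (ErrP) is detected after the robot announces its target choice."""
    return random.random() < 0.3

def classify_gesture(emg_window):
    """Stub for the rolling-window EMG classifier: 'left', 'right', or None."""
    return random.choice(['left', 'right', None, None])

def supervise_selection(targets, robot_choice, eeg_stream, emg_stream, confirm_after=5):
    """If the supervisor's EEG flags the robot's choice as a mistake, let them
    scroll through the targets with left/right gestures; otherwise keep it."""
    choice = robot_choice
    if detect_errp(next(eeg_stream)):          # the human thinks this choice is wrong
        idle = 0
        while idle < confirm_after:            # no gesture for a while = confirmed
            gesture = classify_gesture(next(emg_stream))
            if gesture == 'left':
                choice, idle = (choice - 1) % len(targets), 0
            elif gesture == 'right':
                choice, idle = (choice + 1) % len(targets), 0
            else:
                idle += 1
    return targets[choice]

# Toy usage with fake signal windows:
def fake_stream():
    while True:
        yield [0.0] * 64                       # placeholder signal window

targets = ['left hole', 'center hole', 'right hole']
print(supervise_selection(targets, robot_choice=1,
                          eeg_stream=fake_stream(), emg_stream=fake_stream()))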

[View the poster presented at RSS 2018]

People

Joseph DelPreto

Andres F. Salazar-Gomez

Stephanie Gil

Ramin M. Hasani

Frank H. Guenther

Daniela Rus

Publications

RSS 2018 Paper PDF: Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection

RSS 2018 Poster JPG

Abstract: Control of robots in safety-critical tasks and situations where costly errors may occur is paramount for realizing the vision of pervasive human-robot collaborations. For these cases, the ability to use human cognition in the loop can be key for recuperating safe robot operation. This paper combines two streams of human biosignals, electrical muscle and brain activity via EMG and EEG, respectively, to achieve fast and accurate human intervention in a supervisory control task. In particular, this paper presents an end-to-end system for continuous rolling-window classification of gestures that allows the human to actively correct the robot on demand, discrete classification of Error-Related Potential signals (unconsciously produced by the human supervisor's brain when observing a robot error), and a framework that integrates these two classification streams for fast and effective human intervention. The system also allows 'plug-and-play' operation, demonstrating accurate performance even with new users whose biosignals have not been used for training the classifiers. The resulting hybrid control system for safety-critical situations is evaluated with 7 untrained human subjects in a supervisory control scenario where an autonomous robot performs a multi-target selection task.

Joseph DelPreto, Andres F. Salazar-Gomez, Stephanie Gil, Ramin M. Hasani, Frank H. Guenther, Daniela Rus - Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection
Proceedings of Robotics: Science and Systems, Pittsburgh, Pennsylvania, June 2018
Bibtex
Author: Joseph DelPreto, Andres F. Salazar-Gomez, Stephanie Gil, Ramin M. Hasani, Frank H. Guenther, Daniela Rus
Title: Plug-and-Play Supervisory Control Using Muscle and Brain Signals for Real-Time Gesture and Error Detection
In: Proceedings of Robotics: Science and Systems
Address: Pittsburgh, Pennsylvania
Date: June 2018
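As a rough illustration of the "continuous rolling-window classification" of muscle activity mentioned in the abstract, the Python sketch below slides a short window over a simulated EMG envelope and labels each window from its mean absolute value. The window length, step, feature, and threshold here are assumptions chosen for illustration, not the parameters or classifier used in the paper.

import numpy as np

def rolling_gesture_labels(emg, fs=1000, window_s=0.2, step_s=0.05, thresh=0.15):
    """Slide a window over a 1-D EMG envelope and label each window as
    'gesture' or 'rest' from its mean absolute value (a common EMG feature)."""
    win, step = int(window_s * fs), int(step_s * fs)
    labels = []
    for start in range(0, len(emg) - win + 1, step):
        mav = np.mean(np.abs(emg[start:start + win]))   # mean absolute value feature
        labels.append('gesture' if mav > thresh else 'rest')
    return labels

# Toy usage: quiet baseline followed by a burst of simulated muscle activity
rng = np.random.default_rng(0)
signal = np.concatenate([0.02 * rng.standard_normal(1000),   # rest
                         0.5 * rng.standard_normal(500)])    # gesture burst
print(rolling_gesture_labels(signal)[:10])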

Press Mentions

MIT News

TechCrunch

Fast Company

Popular Mechanics

Engadget

Co.Design

Related Projects

Correcting Robot Mistakes Using Brain Signals
