Scientists at the University of Minnesota have developed a new type of interface that allows humans to control airborne robots with the electricity generated by thoughts, according to research published Tuesday in the Journal of Neural Engineering.
The invention uses a special electroencephalography (EEG) cap to measure the brain’s electrical signals. Software sitting between the user and the robot has learned what those signals look like when someone imagines raising a clenched fist, so when the user dwells on that image, the robot performs a specific task.
The robot was developed after the team’s lead scientist, Professor Bin He, successfully translated brain signals from an EEG helmet to control a computer-rendered helicopter in a simulation.
For the latest tests, five subjects were fitted with EEG caps containing 64 electrodes that let the custom software “learn” each user’s brain signals as they thought about clenching their fists, using that imagined movement to cue up directions for the flying drone.
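To make the idea concrete, here is a minimal sketch of how imagined hand movements can be turned into drone commands. This is not the team’s actual software; the function names, channels, and thresholds are all hypothetical. It relies on a well-known effect in motor-imagery brain-computer interfaces: imagining a hand clench suppresses signal power over the opposite side of the motor cortex, and comparing that power drop between the two sides hints at which hand the user imagined.

```python
# Illustrative sketch only -- not the University of Minnesota system.
# All names, channel layouts, and thresholds here are hypothetical.

def band_power(samples):
    """Mean squared amplitude of one EEG channel window (a crude power proxy)."""
    return sum(s * s for s in samples) / len(samples)

def classify_imagery(left_channel, right_channel, rest_power=1.0):
    """Compare power over each motor cortex against a resting baseline.

    Imagining a right-hand clench suppresses power over the LEFT motor
    cortex (and vice versa), so the side with the larger power drop
    indicates the imagined hand; a negligible drop means "do nothing."
    """
    left_drop = rest_power - band_power(left_channel)    # left cortex -> right hand
    right_drop = rest_power - band_power(right_channel)  # right cortex -> left hand
    if max(left_drop, right_drop) < 0.1:  # hypothetical noise floor
        return "hover"
    return "turn_right" if left_drop > right_drop else "turn_left"

# Example: quieter left-cortex channel suggests an imagined right-hand clench.
print(classify_imagery([0.5, 0.5, 0.5, 0.5], [1.0, 1.0, 1.0, 1.0]))
```

A real system would bandpass-filter the raw EEG and train a per-user classifier across all 64 electrodes, but the mapping from a classified thought to a discrete flight command follows the same shape.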
After a little practice, each user was able to use just their thoughts to help the drone navigate an obstacle course.
Similar products are already available to assist disabled people with computer interactions, and researchers have experimented with using EEG machines to help improve concentration in hyperactive children.
Future iterations of this technology could enable people to move wheelchairs or other mobility assistants, turn electronic devices on or off, and one day even communicate over the Internet just by thinking of a message.
This video was published to YouTube on Wednesday, June 5, 2013.