Scientist creates budget computer eye tracker that could revolutionize mobility
A scientist has created a budget device that can control a computer by tracking eye movement after stumbling on a £9.95 web camera being sold with a games console – a huge saving from the £20,000 that a similar apparatus used for medical research would have cost at the time.
German neuroscientist Dr Aldo Faisal was setting up a laboratory at Imperial College London when he made the chance discovery.
Faisal and his team reconfigured two of the cameras and fixed them to a harness which attaches to the head, making a £43 device that opens up the use of computers to the 6 million people in the UK with restricted hand movement.
The cameras tell a computer where the user is looking, allowing a cursor to be moved around a screen while a wink enables the click of a mouse. While eye tracking had been done before, the team showed that it could be achieved at a fraction of the cost and could eventually lead to such devices being sold in shops.
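The control scheme described above, gaze position driving a cursor and a wink standing in for a mouse click, can be sketched in a few lines. This is purely illustrative: the function names, screen size, and wink rule are assumptions, not details of Faisal's actual system.

```python
# Illustrative sketch of gaze-driven cursor control (assumed details,
# not Faisal's implementation).

SCREEN_W, SCREEN_H = 1920, 1080  # assumed target display size

def gaze_to_cursor(gaze_x, gaze_y):
    """Map a normalised gaze estimate (0..1 on each axis, as a tracker
    might report it) to a pixel position on screen."""
    x = min(max(gaze_x, 0.0), 1.0) * (SCREEN_W - 1)
    y = min(max(gaze_y, 0.0), 1.0) * (SCREEN_H - 1)
    return round(x), round(y)

def is_wink(left_eye_open, right_eye_open):
    """Treat one eye closed while the other stays open as a click.
    Both eyes closing together is an ordinary blink and is ignored."""
    return left_eye_open != right_eye_open
```

In a real system the normalised gaze estimate would come from per-user calibration of the camera images, which is presumably what the 15-second calibration step mentioned later provides.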
Earlier this year, similar technology was used to produce a wheelchair that could be controlled using the eyes. The user can talk while the software detects where they want to go via a £120 eye-tracking bar usually employed to see if people are looking at advertisements.
The software can distinguish between when the user is looking around and when they want to move and the wheelchair responds within 10 milliseconds.
“It may not be the best hardware to do the job but we have by now such good software algorithms that can do data analytics and data processing that you put the intelligence into the software and not the hardware,” Faisal said.
“That is really the transformative thing. So a lot of biomedical engineering devices to help people [are] focused on the hardware – better sensors, better pumps – but most of the time it is how you control stuff, how you analyse the stuff that allows you to do things well.”
Faisal’s lab straddles bioengineering and computing and works to unravel how the brain functions and subsequently how this knowledge can be applied to devices assisting people with restricted mobility.
His discoveries could help amputees, people with paralysis, those with arthritis, and the aged to be more mobile. Since the control of the eye comes straight from the brain, injuries to the spinal cord or amputations do not affect eye movements, said Faisal. Parkinson’s disease can affect eye movements but a device can still be controlled while multiple sclerosis also has little effect for these purposes, he says.
“The actual power of the eye-based user interfaces we build is not actually in the cheap hardware, it just shows that it works on very cheap hardware.
“What no one has done is build decoders which tell what is your intention of action based on how you look at the world and that is what we are really doing. We are building systems which, based on your eye movement behaviour, try and work out what you are going to do next.” A robot arm directed by eye movement has also been developed at the lab.
Faisal compares the budget developments to the human brain. “It is incredibly unreliable in each individual piece so the hardware is really crappy. Our eyes are not particularly amazing. Most of our senses are not amazing. But yet together the whole [thing] works extremely well and extremely reliably and that is because our brain runs amazing software so that’s why I think that it is the smart software and not the expensive devices which can really turn things around,” said Faisal.
Using the eyes as a way to instruct a computer would also be a non-invasive way around a problem, he says. Research has found that people with prosthetic arms take them off before they take off their shoes when they return home, as they are “so unnatural, so alien”.
When a version of Faisal’s video game was brought to an exhibition, 70% of people who tried it could use it after 15 seconds of calibration.
“If we can build a device which works out what you want to do not based upon spending a lot of time working out how you think but simply based upon rules that are given by how the brain works in all of us, I can build devices which just work,” Faisal said.
Cheap, widely available eye-tracking devices could be on the market in between two and five years, according to Faisal, who is working on plans to commercialise his work. As society gets older, there will be an increased need for such devices, he says.
Future developments could also help those with dementia: a person with the condition may have difficulty recalling specific memories of an event. A device – like a pair of glasses or a contact lens – would detect through the eye what they are trying to recall and feed them images in order to help them with the memory, said Faisal.
“It is trying to see how far we can push these non-invasive technologies that decode your intention and make things more interactive and easier,” he said.
Eye-controlled wheelchair: how it works
The eye-controlled wheelchair centres on software that can separate the eye movements used when talking from those employed when navigating.
The team built software to exploit the differences and put it to use in the prototype wheelchair, using a commercially available eye tracker in a black plastic bar more usually witnessed in advertising. This created what Faisal says is the first device that allows the user to talk and navigate at the same time, without fully focusing on controlling the device.
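One plausible way to separate the two kinds of eye movement, sketched below, is a dwell-time rule: conversational gaze tends to scatter in quick saccades, while a deliberate navigation command holds steady on one spot. The thresholds and the rule itself are assumptions for illustration; the decoding used in the actual wheelchair has not been described in this detail.

```python
# Hypothetical stand-in for the talk-vs-navigate separation described
# above, using a simple dwell-time rule (assumed, not the team's method).

DWELL_FRAMES = 15   # assumed: ~0.5 s of steady gaze at 30 fps signals intent
MAX_SPREAD = 0.05   # assumed: tolerated gaze jitter, in normalised units

def wants_to_move(gaze_samples):
    """Return True if the most recent DWELL_FRAMES gaze points all fall
    inside a small window -- a steady fixation treated as a navigation
    command. Scattered saccades, typical of conversation, return False."""
    if len(gaze_samples) < DWELL_FRAMES:
        return False
    recent = gaze_samples[-DWELL_FRAMES:]
    xs = [p[0] for p in recent]
    ys = [p[1] for p in recent]
    return (max(xs) - min(xs) <= MAX_SPREAD and
            max(ys) - min(ys) <= MAX_SPREAD)
```

A per-frame check like this is cheap enough to run well inside the 10-millisecond response time the article reports.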