Using the Mind to Control Robots

From Our Changing World, 9:06 pm on 21 August 2014

With a wireless headset designed primarily for computer gaming, PhD student Nathan Scott can control a robot with his mind.

By lifting his eyebrows he can make the small table-top robot move forward on its wheels; by looking to the left he makes it turn left, and by looking to the right he makes it turn right. And within minutes of putting the headset on, Ruth Beran can control the robot too.
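To make that mapping concrete, here is a minimal sketch of how such gesture-to-command control might look in code. It is purely illustrative: the article does not name the headset or its software interface, so the expression names and helper functions are invented and the event stream is simulated.

```python
# Hypothetical sketch of gesture-to-command robot control.
# The expression labels and the drive() helper are invented for illustration;
# the real headset SDK is not described in the article.

EXPRESSION_TO_COMMAND = {
    "raise_eyebrows": "forward",  # lift eyebrows -> robot drives forward
    "look_left": "turn_left",     # glance left   -> robot turns left
    "look_right": "turn_right",   # glance right  -> robot turns right
}

def drive(command: str) -> None:
    """Stand-in for sending a motion command to the table-top robot."""
    print(f"robot command: {command}")

def control_loop(events) -> None:
    """Translate each detected facial expression into a drive command."""
    for expression in events:
        command = EXPRESSION_TO_COMMAND.get(expression)
        if command is not None:
            drive(command)

if __name__ == "__main__":
    # Simulated stream of expressions a headset might report.
    control_loop(["raise_eyebrows", "look_left", "raise_eyebrows", "look_right"])
```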

“One of the real, significant advantages of this technology is that a complete novice can sit down and just start driving a robot, or control a computer cursor, or control an exoskeleton, with very, very minimal training,” says Nathan, who is studying in the School of Computer and Mathematical Sciences at Auckland University of Technology.

“As opposed to other brain-computer interface systems where the training can take hours, days, months depending on how complex the task is,” he says.

The headset costs only about $700, compared with medical-grade EEG machines, which can cost about $100,000.

Both systems record the electrical potentials on the scalp, allowing non-invasive extraction of the signals produced in the brain. The headset that Nathan is using to control the robot has 14 pads dampened with saline to conduct electricity better.

“So the electrical currents on the surface of your head represent – to a certain degree – the amount of electrical activity that’s going on inside your head, which represents which areas of your brain are active at a certain time,” says Nathan.
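As a rough illustration of what that means in practice, the sketch below turns a simulated window of 14-channel scalp recordings into a per-electrode amplitude estimate. Real EEG pipelines add filtering and artefact rejection, and every number here is invented.

```python
# Illustrative only: turning raw scalp potentials into a crude per-channel
# "activity" estimate, using amplitude as a rough proxy for brain activity.

import numpy as np

def channel_activity(eeg):
    """Root-mean-square amplitude per channel.

    eeg: array of shape (n_samples, n_channels), in microvolts.
    Returns one value per electrode site.
    """
    return np.sqrt(np.mean(eeg ** 2, axis=0))

# Simulated one-second window from a 14-channel headset (sampling rate invented).
rng = np.random.default_rng(1)
window = rng.normal(scale=20.0, size=(128, 14))  # ~20 microvolts of noise
print(channel_activity(window))
```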

Denise Taylor and Nathan Scott with the mind-controlled robot and headset, and Ruth Beran wearing the headset after controlling the robot. Photo: RNZ / R. Beran

To control a robot or other interfaces, though, the electrical information needs to be interpreted as meaningful data rather than just noise. To do this, the team at AUT has developed a system to classify the patterns that exist when someone is thinking.

“NeuCube is a model shaped like the brain using brain-like computing techniques,” says Nathan. “We record the data, we preprocess it to some extent…and then feed this into a NeuCube reservoir.”

From there, the arbitrary brain patterns are classified. “We’ve shown that this model can classify with a very, very high accuracy whether someone is even imagining moving their hand left or right,” says Nathan.
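The article does not detail NeuCube's internal model, so the sketch below only mirrors the record, preprocess, reservoir, classify shape of the pipeline Nathan describes. The reservoir is a generic echo-state-style stand-in, not NeuCube itself, and all dimensions and data are invented toy values.

```python
# Rough sketch of a record -> preprocess -> reservoir -> classify pipeline.
# The random recurrent "reservoir" here is a generic stand-in, NOT NeuCube's
# actual model; trials are random arrays standing in for EEG recordings.

import numpy as np

rng = np.random.default_rng(0)

def preprocess(eeg):
    """Per-channel normalisation: zero mean, unit variance."""
    return (eeg - eeg.mean(axis=0)) / (eeg.std(axis=0) + 1e-8)

def reservoir_state(eeg, w_in, w_res, leak=0.3):
    """Drive a fixed random recurrent network with the signal and
    return its final state as a feature vector."""
    state = np.zeros(w_res.shape[0])
    for sample in eeg:  # one update per EEG time step
        state = (1 - leak) * state + leak * np.tanh(w_in @ sample + w_res @ state)
    return state

# Invented dimensions: 14 channels (matching the headset), 200 reservoir units.
n_channels, n_units = 14, 200
w_in = rng.normal(scale=0.5, size=(n_units, n_channels))
w_res = rng.normal(scale=0.1, size=(n_units, n_units))

def features(trial):
    return reservoir_state(preprocess(trial), w_in, w_res)

# Toy "imagined left vs right" trials: 128 samples x 14 channels each.
left = [rng.normal(size=(128, n_channels)) for _ in range(20)]
right = [rng.normal(loc=0.3, size=(128, n_channels)) for _ in range(20)]
X = np.array([features(t) for t in left + right])
y = np.array([0] * 20 + [1] * 20)

# Simplest possible readout: least-squares linear classifier on reservoir states.
w = np.linalg.lstsq(X, 2 * y - 1, rcond=None)[0]
pred = (X @ w > 0).astype(int)
print("training accuracy:", (pred == y).mean())
```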

While the robot that Nathan is demonstrating has been trained on facial movements and is using a very simple version of the NeuCube system, the model is being used to analyse data from radio astronomy, weather, and earthquakes, with good preliminary results.

In the meantime, Associate Professor Denise Taylor from the Health and Rehabilitation Research Institute at AUT can see the benefit of NeuCube for people who've had a stroke, traumatic brain injury, or any other sort of injury where there's a limitation to the control of movement.

“Our next step from here is trying to link the NeuCube with the headset and electrical muscle stimulation,” says Denise. “So the person can think of a movement that they want to do and the electrical muscle stimulation will produce the movement.”