Led by its research director, Professor Damien Coyle, the Intelligent Systems Research Centre (ISRC) is developing the next generation of neurotechnology and human-machine interaction.
But those words come nowhere near doing justice to the importance of the work taking place at the centre. Simply put, Prof Coyle and his team are looking at ways to detect and interpret brain signals that will benefit people with a wide variety of conditions, including stroke, spinal injury, and severe brain injury resulting in a minimally conscious state.
The neurotechnology and brain-computer interfaces which have been developed at ISRC enable communication with patients who cannot move in any way and have applications across a wide range of therapeutic areas.
At its most basic, it starts with an electroencephalogram (EEG), a set of electrodes placed on the scalp to detect and measure brain activity. People are asked to imagine a movement or action and the EEG measures the resulting brain activity. Over time, the brainwaves or signals associated with particular actions can be identified.
This knowledge is then taken to the next level by the application of sophisticated artificial intelligence algorithms which translate the complex signals from the brain into usable information.
“An EEG is quite a complex signal,” Prof Coyle explains. “We know that certain frequencies and signals are associated with different things. We can ask a patient to imagine mental tasks or movement or use a visual stimulus to evoke a response. For example, if you place electrodes over the motor cortex and ask the patient to imagine a movement of a limb you would expect to see changes in brain activity on the opposite side of the cortex from the limb. The great benefit is the patient doesn’t have to move for this.”
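The frequency changes Coyle describes can be sketched in a few lines of signal processing. The sketch below is purely illustrative, using synthetic signals, a generic 8–12 Hz mu band and SciPy's Welch power-spectrum estimator; it is not the centre's actual pipeline:

```python
import numpy as np
from scipy.signal import welch

FS = 250  # sampling rate in Hz, typical for wearable EEG headsets

def mu_band_power(eeg, fs=FS, band=(8.0, 12.0)):
    """Average power in the mu band (8-12 Hz) for one EEG channel."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

# Synthetic illustration: at rest there is a strong 10 Hz mu rhythm over
# the motor cortex; imagining a movement suppresses it, a change known as
# event-related desynchronisation (ERD).
rng = np.random.default_rng(0)
t = np.arange(0, 4, 1 / FS)
rest = 2.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)
imagery = 0.5 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

erd = (mu_band_power(rest) - mu_band_power(imagery)) / mu_band_power(rest)
print(f"relative mu-power drop during imagery: {erd:.0%}")
```

Detecting a consistent drop like this, over electrodes on the opposite side of the motor cortex from the imagined limb, is the raw signal that the centre's AI algorithms build on.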
This opens up the potential for dramatic improvements in the treatment of patients with serious brain injuries. At present, there are over 100 million people around the world who have difficulties communicating and interacting with technology due to physical disability caused by disease or injury. In some cases, people who cannot communicate following severe brain injury may be in a minimally conscious state, or in a vegetative state – termed prolonged disorders of consciousness (PDOC) – because they cannot move to communicate.
We have established partnerships with 16 hospitals across the UK and Ireland
According to Prof Coyle, this is a desperate situation for patients, their families and the clinical teams, who are limited by outdated, traditional assessment methods and need new tools to assess patients and improve their quality of life. With existing techniques, diagnosis is very challenging, imprecise and time-consuming.
However, neurotechnology provides options for assessment and communication that do not require a patient to have any movement capacity. “This has been a major area of focus for us,” says Coyle. “We have established partnerships with 16 hospitals across the UK and Ireland and received ethical approval to work with PDOC patients. In essence, we put the headset on and ask them to imagine certain movements. If we detect differences in brain activity, we can see that there is a conscious response.”
Once that response is detected, the patient is put on a training programme to determine whether they can learn to use the technology. The next step is to use the brain-computer interface to allow them to interact with a computer or even answer basic questions that have a yes-no response.
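The yes-no paradigm can be illustrated with a toy decision rule. Everything below is a simplified assumption on my part: real systems train classifiers over many trials rather than thresholding two channel powers, and the channel names, threshold and yes/no mapping are hypothetical.

```python
def decode_yes_no(c3_power, c4_power, threshold=0.2):
    """Toy decision rule for two-class motor imagery.

    Imagined right-hand movement suppresses mu power over the left motor
    cortex (electrode C3) relative to the right (C4), and vice versa.
    Map that lateralisation to a yes/no answer; the threshold and the
    mapping are illustrative, not the centre's protocol.
    """
    lateralization = (c4_power - c3_power) / (c4_power + c3_power)
    if lateralization > threshold:
        return "yes"    # right-hand imagery (C3 suppressed)
    if lateralization < -threshold:
        return "no"     # left-hand imagery (C4 suppressed)
    return "uncertain"  # no reliable response detected

print(decode_yes_no(c3_power=1.0, c4_power=2.0))  # → yes
print(decode_yes_no(c3_power=2.0, c4_power=1.0))  # → no
print(decode_yes_no(c3_power=1.0, c4_power=1.1))  # → uncertain
```

The "uncertain" branch matters clinically: a system like this must distinguish "the patient answered no" from "no reliable response was detected at all".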
We are some way off enabling people to control bionic hands with their thoughts, however. Coyle explains that it is challenging to control multiple degrees of movement using non-invasively recorded brain signals such as EEG, which lacks the spatial resolution to distinguish individual finger movements or fine arm movement, although the team has recently shown that some 3D arm movements can be decoded from EEG. “However, when you look at more invasive technologies, such as the insertion of sensors at the neuron or synaptic level, or on the surface of the cortex, more accurate hand and arm movement decoding is possible.”
In the meantime, what’s already being done is pretty impressive. In 2016, Prof Coyle founded NeuroCONCISE to develop wearable, AI-enabled neurotechnology, and has trialled a low-cost wearable solution to collect data from a large population.
A team from NeuroCONCISE competed in Cybathlon 2016 in Zurich, the first championship for athletes with disabilities. A spinal-injured pilot, Mr Owen Collumb, from Longford and living in Dublin, was recruited through ISRC’s long-running partnership with the National Rehabilitation Hospital in Dun Laoghaire and competed against other spinal-injured pilots using only brainwaves to control avatars in a virtual race.
“We are very excited to be participating again this year in Zurich,” Coyle adds.
The technology also has applications in physical rehabilitation following stroke or other brain injuries. The physical therapy involved usually comprises repetitive movements aimed both at restoring physical function to the affected part of the body and at forming new connections in the brain so it can learn once again to control it.
In some cases this will involve the area of the brain originally responsible for those movements and in others different parts of the brain will take on new duties to replace damaged areas. This ability is known as neuroplasticity.
Physical therapy has been proven to work, but it is not without its shortcomings. One is that resource constraints on health services mean patients tend not to get enough of it. The other is that patients can lack motivation to engage in exercises that are simultaneously arduous and boring.
The technology can deal with both these issues. “Using a brain-computer interface we can help motivate the patient by combining the therapy with a game where the exercises result in them moving a cursor or character on a screen,” says Coyle. “We can also hook the interface up to something like a robotically assisted exoskeleton.”
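The game idea Coyle describes can be sketched as a simple mapping from classifier confidence to cursor speed. The gain and dead-zone values below are illustrative assumptions, not parameters from the ISRC system:

```python
def cursor_velocity(p_move, gain=10.0, deadzone=0.6):
    """Map classifier confidence that the patient is imagining the
    rehearsed movement (p_move in [0, 1]) to horizontal cursor speed.

    A dead zone keeps the cursor still unless the imagery is clearly
    detected, so noise does not jitter the game; above it, speed rises
    linearly with confidence. All values here are illustrative.
    """
    if p_move < deadzone:
        return 0.0
    return gain * (p_move - deadzone) / (1.0 - deadzone)

print(cursor_velocity(0.5))  # → 0.0 (below the dead zone: cursor stays still)
print(cursor_velocity(0.9))  # → 7.5 (confident imagery moves the cursor)
```

The same control signal could, in principle, drive a robotic exoskeleton instead of an on-screen cursor, which is the coupling the next paragraph describes.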
That’s where it can get really interesting. An exoskeleton is worn outside the body and can take the place of the physical therapist. It never gets tired and is available 24/7, thereby addressing resource issues. And with a direct connection to the patient’s brain the effectiveness of the therapy could be greatly enhanced.
The latest initiative from the ISRC is the Spatial Computing and Neurotechnology Innovation Hub (SCANi-hub), which was recently opened by Princess Anne. The hub houses cutting-edge technology to determine the body’s and brain’s responses to stress, fatigue, achievement, awareness, error and threats in complex training and performance-assessment scenarios simulated in virtual environments.
We invite collaborators who want to work with us in this space to help us develop our technologies and research capabilities further
The facility contains multiple mobile wearable EEG headsets, functional near-infrared spectroscopy brain imaging, an advanced car/flight simulator and various new AR and VR spatial computing technologies, vibrotactile stimulation suits and ultrasonic haptic interfaces. Other technologies include those that enable walking in virtual environments – virtual treadmills – as well as a state-of-the-art smartglass façade to adapt the room for various experimental situations and public engagement activities.
Potential applications for the technology extend beyond the physical into psychological therapies. “Neurofeedback therapies are becoming more popular again,” says Coyle. “We have been out in Rwanda using our technology to help people modulate their brain activity to help them address trauma symptoms.”
And new applications are constantly being found. “We invite collaborators who want to work with us in this space to help us develop our technologies and research capabilities further, or to commercialise our products,” Coyle concludes.
Intelligent Systems Research Centre at Ulster University