EEG-based Brain-Computer Interfaces (BCIs) are a non-invasive technology for translating brain activity into commands that control an effector (such as a computer keyboard, mouse, etc.). Many patients who cannot communicate effectively, such as those affected by stroke, locked-in syndrome, or other neurodegenerative diseases, rely on BCIs to stay connected. Some of the most common BCI modalities are P300, SSVEP, slow cortical potentials, and sensorimotor rhythms. With Wearable Sensing’s revolutionary dry EEG technology, nearly any type of BCI is possible with our research-grade signal quality. Because DSI systems are extremely easy to use and comfortable, they have opened the door to translating a wide range of BCI applications to the real and virtual worlds.
The P300 is an event-related potential (ERP) elicited by the “oddball” paradigm: roughly 300 ms after a rare or “odd” stimulus is presented, the brain produces a distinctive response. This response can be decoded and classified in real time for a variety of different applications.
One such use case is the P300 speller, in which a series of letters is flashed on a screen; when the “target” letter appears, the brain produces a P300 response, which can then be translated into a letter selection.
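The selection logic behind a P300 speller can be sketched in a few lines. The snippet below is purely illustrative (it is not the algorithm used by any particular speller): it averages the EEG epochs following each letter’s flashes and picks the letter whose average shows the largest amplitude in the window where a P300 is expected. All names, parameters, and the synthetic data are assumptions for demonstration.

```python
import numpy as np

FS = 300            # assumed sampling rate in Hz
EPOCH_S = 0.8       # epoch length after each flash, in seconds

def p300_score(epochs, fs=FS, window=(0.25, 0.45)):
    """Average the epochs for one letter and return the mean
    amplitude inside the window where a P300 is expected."""
    avg = np.mean(epochs, axis=0)                 # grand average, shape (samples,)
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    return float(np.mean(avg[lo:hi]))

def select_letter(epochs_by_letter):
    """Pick the letter whose averaged response has the largest
    amplitude in the P300 window."""
    return max(epochs_by_letter, key=lambda k: p300_score(epochs_by_letter[k]))

# --- toy demonstration with synthetic data ---
rng = np.random.default_rng(0)
n_samples = int(EPOCH_S * FS)
t = np.arange(n_samples) / FS

def make_epochs(is_target, n_epochs=10):
    noise = rng.normal(0.0, 1.0, (n_epochs, n_samples))
    if is_target:
        # add a positive deflection peaking ~300 ms after the flash
        noise += 3.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))
    return noise

epochs = {"A": make_epochs(False), "B": make_epochs(True), "C": make_epochs(False)}
print(select_letter(epochs))   # prints "B", the simulated target
```

Averaging across repeated flashes is what makes this work in practice: the P300 is small relative to background EEG, but noise averages toward zero while the time-locked response does not.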
Betts Peters, Dr. Melanie Fried-Oken, and their team at Oregon Health & Science University have developed a P300 speller using the DSI-24 and validated its functionality with subjects who have locked-in syndrome.
Steady State Visually Evoked Potentials (SSVEP) are natural responses to visual stimuli flickering at specific frequencies. In a typical SSVEP paradigm, targets flash at differing frequencies, anywhere from 3.5 Hz to 75 Hz, and the brain produces a characteristic response at the frequency of whichever target the subject is attending to.
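Because each target is tagged by its flicker frequency, a minimal SSVEP classifier only needs to find which candidate frequency dominates the EEG spectrum. The sketch below is an assumption-laden illustration (not the Neuracle software’s algorithm, which production systems typically replace with more robust methods such as canonical correlation analysis): it compares spectral power at each candidate frequency’s fundamental and picks the strongest.

```python
import numpy as np

FS = 300  # assumed sampling rate in Hz

def ssvep_classify(signal, freqs, fs=FS):
    """Return the candidate stimulus frequency with the most spectral
    power in the signal (fundamental frequency only, for brevity)."""
    spectrum = np.abs(np.fft.rfft(signal))
    bins = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    # power at the FFT bin nearest each candidate frequency
    powers = [spectrum[np.argmin(np.abs(bins - f))] for f in freqs]
    return freqs[int(np.argmax(powers))]

# toy demonstration: the subject attends the target flickering at 12 Hz
rng = np.random.default_rng(1)
t = np.arange(2 * FS) / FS                          # 2 s of data
eeg = np.sin(2 * np.pi * 12 * t) + rng.normal(0, 0.5, t.size)
print(ssvep_classify(eeg, [8.0, 10.0, 12.0, 15.0]))  # prints 12.0
```

Longer analysis windows sharpen the frequency resolution and raise accuracy, which is the trade-off behind the sub-second classification times mentioned below: with strong signal quality, even a short window separates the candidate frequencies cleanly.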
As shown in the video, a 12-target numbered keyboard is set up, and the subject counts upward through the targets. No training is required, and in some cases the algorithm can classify the attended target correctly in under one second.
This specific SSVEP software was developed by Wearable Sensing’s collaborator in China, Neuracle, and is available for purchase for all DSI systems. The software comes ready to use, with customizable 12-target and 40-target keyboards designed for ultra-rapid, high-accuracy classification.
Motor Imagery is a BCI technique in which the subject imagines performing a movement with a particular limb. The imagined movement alters rhythmic activity in the regions of the sensorimotor cortex that correspond to that limb. The BCI decodes these signals and translates the imagined movement into feedback in the form of cursor movements or other computer commands.
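The rhythmic change being decoded is typically suppression of the mu rhythm (roughly 8–12 Hz) over the hemisphere contralateral to the imagined limb. The sketch below is a deliberately crude illustration under assumed names and parameters (real decoders use spatial filtering and trained classifiers): it compares mu-band power at electrodes over the left and right sensorimotor cortex (C3 and C4 in the 10–20 system) to guess which hand the subject is imagining moving.

```python
import numpy as np

FS = 300  # assumed sampling rate in Hz

def mu_power(signal, fs=FS, band=(8.0, 12.0)):
    """Mean spectral power of `signal` inside the mu band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    bins = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (bins >= band[0]) & (bins <= band[1])
    return float(np.mean(spectrum[mask]))

def classify_imagery(c3, c4):
    """Lower mu power marks the active (contralateral) hemisphere:
    suppressed C3 suggests right-hand imagery, suppressed C4 left-hand."""
    return "right" if mu_power(c3) < mu_power(c4) else "left"

# toy demonstration: right-hand imagery suppresses the mu rhythm at C3
rng = np.random.default_rng(2)
t = np.arange(2 * FS) / FS
c3 = 0.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size)  # suppressed
c4 = 1.0 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.3, t.size)  # intact
print(classify_imagery(c3, c4))  # prints "right"
```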
The DSI-24 was featured in an interactive art installation, “Mental Work,” at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland. During the exhibit, subjects were presented with a wheel that they controlled by imagining moving one of their arms.
Neurolutions is a medical device company developing neuro-rehabilitation solutions that seek to restore function to patients who are disabled as a result of neurological injury. The Neurolutions IpsiHand system provides upper-extremity rehabilitation for chronic stroke patients by leveraging brain-computer interface and advanced wearable robotics technology.
By utilizing the DSI-7, Neurolutions is able to use Motor Imagery techniques to decode a patient’s intent to move their finger, which then instructs the exoskeleton to physically move the finger. With repeated sessions, patients can regain control of their impaired limbs.
What fires together, wires together!