Wearable Sensing hosts a series of webinars, interactive presentations, conferences, and more that aim to educate on the current state of dry electrode technology and its integration into practical applications. Recordings of these sessions are available for free online. For information on upcoming webinars and events, see the News and Events page.
Wearable Sensing has recently launched a new Dry EEG + tDCS system developed in conjunction with Soterix Medical. This system integrates 6 customizable dry electrode EEG sensors and 2 tDCS electrodes into a single headset. Our machine learning (aka AI) cognitive state classification algorithm, QStates, can be used to close the loop, enabling automatic stimulation in response to detected cognitive states. Join us for this webinar in which we discuss this technology and its applications and demonstrate the system in action.
Wearable sensors are enabling the collection of vast amounts of continuous physiological data in real-world environments. This webinar will describe how machine learning algorithms can be trained to classify cognitive states from these data, focusing on EEG and briefly addressing ECG, EMG, EOG, GSR, and respiration modalities. We will describe some practical applications and limitations of this approach.
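To illustrate the general idea behind such classifiers, here is a minimal, self-contained sketch (not Wearable Sensing's QStates algorithm): it classifies synthetic "workload" vs. "rest" trials from two invented EEG band-power features (frontal theta and alpha) with a nearest-centroid rule. The feature values, class structure, and function names are all hypothetical, chosen only to show the train-then-classify pattern.

```python
import random

random.seed(0)

def make_trials(theta_mean, alpha_mean, n=50):
    """Generate synthetic (theta, alpha) band-power trials around class means."""
    return [(random.gauss(theta_mean, 0.3), random.gauss(alpha_mean, 0.3))
            for _ in range(n)]

# Assumed (illustrative) pattern: mental workload raises frontal theta
# power and suppresses alpha power relative to rest.
train = {"workload": make_trials(2.0, 0.8), "rest": make_trials(1.0, 1.8)}

def centroid(trials):
    """Mean feature vector of a set of trials."""
    n = len(trials)
    return (sum(t[0] for t in trials) / n, sum(t[1] for t in trials) / n)

centroids = {label: centroid(trials) for label, trials in train.items()}

def classify(trial):
    """Assign the label of the nearest class centroid (squared Euclidean distance)."""
    return min(centroids, key=lambda lbl: (trial[0] - centroids[lbl][0]) ** 2
                                        + (trial[1] - centroids[lbl][1]) ** 2)

print(classify((2.1, 0.7)))  # high theta, low alpha -> workload
print(classify((0.9, 1.9)))  # low theta, high alpha -> rest
```

Real systems replace the synthetic features with band powers extracted from recorded EEG epochs, and the nearest-centroid rule with stronger classifiers, but the workflow (label training data, fit a model, classify new trials) is the same.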
Autism spectrum disorder (ASD) is a neurodevelopmental disorder that affects ~2% of children in the USA and ~0.5% worldwide, and is associated with intellectual and communication disabilities, which are often accompanied by impaired emotional regulation (ER). Electroencephalography (EEG)-based neurofeedback has been successfully used to reduce ASD symptoms. The comfort and ease of use of Wearable Sensing’s dry electrode headsets have facilitated research and neurofeedback practice with children with autism, whose hypersensitivities often cause them to refuse conventional wet electrode EEG systems. In this webinar, Dr. Murat Akcakaya, associate professor at the University of Pittsburgh, will describe his team’s recently published (1) efforts to develop a BCI that performs real-time interventions for ER based on EEG signatures. Using EEG collected with Wearable Sensing’s DSI-24, Dr. Akcakaya’s BCI was able to reliably differentiate between distress and non-distress conditions in 21 individuals with ASD on a single-trial basis during a game with deception. This BCI could ultimately provide real-time, automated feedback that can help guide ER control strategies during current clinical behavioral therapy or in virtual reality scenarios.
EEG-based Brain-Computer Interfaces (BCI) are maturing in research labs, and some applications are ready to transition to the real or virtual worlds. Successful BCI transitions require practical EEG headsets that deliver high-fidelity EEG signals robust to movement artifacts and environmental interference. This webinar will showcase Neuracle’s high-density mobile EEG systems and Wearable Sensing’s wireless dry electrode EEG headsets. These products provide research-grade EEG signals that can be used in real and virtual world settings. We will demonstrate these systems, show their robustness to artifacts, and present some of their real- and virtual-world BCI applications.
This webinar will introduce Wearable Sensing’s new DSI-Hybrid-EEG+fNIR system, which integrates dry electrode EEG sensors and functional Near-InfraRed (fNIR) sensors into a single headset. The non-intrusive wireless DSI-Hybrid EEG + fNIR headset is designed for synchronized recording of the brain’s electrical activity and its hemodynamic, or blood-oxygen-level-dependent (BOLD), response in ambulatory real and virtual environments. The EEG and fNIR sensors are arranged to allow simultaneous and superimposed recordings at locations distributed across the scalp. The system produces raw data, and a machine learning algorithm fuses both signals for cognitive state classification. The webinar will describe this technology and briefly present a few applications that highlight the advantages of such bimodal EEG+fNIR sensing.
This webinar will explore how wearable EEG is combined with eye-tracking to provide relevant context information for research and neuromarketing applications. The presentation will showcase Wearable Sensing’s wireless Dry electrode Sensor Interface (DSI) technology and EyeTech’s remote eye-tracking solutions, both of which provide high-fidelity data that is reliable across a wide range of users and robust against motion and environmental conditions. These EEG and eye-tracking systems are easily integrated with TEA Ergo’s Neurolab software, a neuromarketing platform that enables the synchronization of presentation material with EEG, eye-tracking, GSR, and facial coding.
This webinar will showcase Wearable Sensing’s wireless Dry Sensor Interface (DSI) technology and TEA Ergo’s CAPTIV, a platform that enables the synchronization of EEG with IMU motion- and eye-tracking.
In this webinar, we discuss Wearable Sensing’s dry electrode technology, the dry electrode EEG systems we offer, and the options to integrate auxiliary sensor modalities, including 3D accelerometer, ECG, EMG, EOG, respiration, skin temperature, and Galvanic Skin Response (GSR). The webinar will also introduce TEA Ergo’s CAPTIV platform, which, in addition to the above modalities, enables synchronized integration with motion- and eye-tracking and video, as well as integrated data analysis.
In this webinar, we discuss Wearable Sensing’s dry electrode technology, the dry electrode EEG systems we offer and how they integrate with VR, and finally how you can use EEG + VR in various research, medical, and commercial applications.