Editor’s Note
This article is written from a first-person perspective as part of the OneResearch (1R) High School Outreach Program. The content reflects the author’s individual research experiences and interpretations and is intended to showcase early-stage scientific exploration by student researchers.
Background
Amyotrophic lateral sclerosis (ALS) affects an estimated 1.9 to 6 out of every 100,000 people worldwide and is characterized by progressive degeneration of motor neurons, ultimately resulting in full-body paralysis and loss of speech. ALS has a devastating personal and societal impact, with the average survival time post-diagnosis being just 2 to 5 years. While assistive technologies like eye-tracking systems and speech synthesizers have helped patients maintain communication, full-body mobility tools such as exoskeletons and neural control systems remain largely inaccessible.
Epilepsy, on the other hand, affects over 50 million people globally, according to the World Health Organization, and around 80% of these individuals live in low- and middle-income countries. The unpredictability of seizures can pose life-threatening risks and severely impair quality of life. While treatment through antiepileptic drugs exists, nearly 30% of patients are drug-resistant, leaving them vulnerable to sudden seizure onset. Current seizure alert systems are often reactive, not predictive: they rely on ictal signs such as convulsions or muscle contractions rather than the underlying neural precursors that precede a seizure.
These two conditions, one progressive, one episodic, both represent critical gaps in biomedical support systems: a need for more intuitive, personalized, and cost-effective neurological interfaces.
Design & Implementation
Brain-Computer Interfaces (BCIs)
A brain-computer interface (BCI) is a system that allows direct communication between the brain, or associated electrical activity, and an external device, such as a computer or robotic system. BCIs are designed to bypass traditional neuromuscular pathways, enabling individuals to interact with technology using only brain signals or subtle bioelectrical inputs.
In my system, for example, the BCI identifies specific patterns related to eye blinks and muscle twitches. These blinks produce distinct electrical signals that are detected by repurposed ECG electrodes placed near the forehead and temples. The raw signals are streamed to a microcontroller, then passed into a Python-based environment where a trained Support Vector Machine (SVM) classifier determines whether a blink has occurred. If a blink is detected, the BCI sends a command to activate a robotic arm, allowing the user to drive movement through these bioelectrical signals alone.
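The streaming pipeline described above can be sketched in a few lines. This is a minimal illustration, not the actual system code: the 250 Hz sampling rate, 0.5-second window, and z-score threshold are all assumed values, and the threshold test stands in for the trained SVM classifier described later. The "MOVE_ARM" command string is likewise hypothetical.

```python
import numpy as np

FS = 250            # assumed sampling rate in Hz
WINDOW = FS // 2    # 0.5-second analysis window
THRESHOLD = 4.0     # z-score peak threshold, standing in for the trained SVM

def detect_blink(window: np.ndarray) -> bool:
    """Return True if the window's peak deviation suggests a blink artifact."""
    z = (window - window.mean()) / (window.std() + 1e-9)
    return bool(np.abs(z).max() > THRESHOLD)

def stream_to_commands(signal: np.ndarray) -> list:
    """Slide over the streamed signal and emit one command per detected blink."""
    commands = []
    for start in range(0, len(signal) - WINDOW + 1, WINDOW):
        if detect_blink(signal[start:start + WINDOW]):
            commands.append("MOVE_ARM")   # hypothetical command sent to the arm
    return commands

# Synthetic demo: background noise with one injected blink-like spike
rng = np.random.default_rng(0)
sig = rng.normal(0, 1, FS * 2)
sig[300] = 40.0  # blink artifact lands in the third window
print(stream_to_commands(sig))
```

In the real system the windows would arrive from the microcontroller's ADC stream rather than a prerecorded array, but the window-classify-command structure is the same.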
Experimental Design
I began my research journey in 7th grade by testing the capabilities of a commercial EEG device. Initial experiments used 20–80 second interval training periods to distinguish between “blink” and “no-blink” states through time-segmented neural input. These signals, specifically artifacts related to eye and slight muscle movement, were processed by a trained classifier and translated into robotic arm motion, enabling basic control through voluntary blinks and facial twitches.
In 8th grade, I expanded this framework into a more robust, early-stage brain-computer interface (BCI) system for mobility and seizure detection. BCIs interpret brain activity and convert it into executable commands, offering new pathways for communication and control without traditional neuromuscular input. To prototype a BCI using accessible materials, I employed consumer-grade ECG sensors placed at frontal and temporal electrode positions (AF7, AF8, T9, and T10) to capture blink-associated electrical signals. Although these sensors are typically used for cardiac monitoring, they were repurposed as a low-cost alternative to clinical EEG systems and embedded into a 3D-printed head frame for consistent placement and reduced motion artifacts.
Signal acquisition was handled by a microcontroller and ADC, with data streamed to a Python-based processing environment. I developed a training dataset of blink vs. no-blink intervals, applying basic noise filtering and extracting features such as peak amplitude and time-domain signal changes. A Support Vector Machine (SVM) classifier was trained using an 80/20 train-test split, achieving classification accuracies exceeding 97% across trials. The model’s output activated a servo-driven robotic arm in real time, converting neural input into directional movement.
Building on this system, I adapted the architecture to begin exploring seizure prediction using EEG-style data. By extending the time windows and incorporating frequency-based features, the model was trained to identify pre-ictal patterns associated with epilepsy. While this application remained in early-stage testing, it demonstrated the potential for low-cost, ML-assisted BCIs to enhance both mobility and early neurological event detection for underserved populations.
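The frequency-based features mentioned above can be illustrated with a simple band-power computation over the standard EEG bands. This is a sketch under assumptions: the 250 Hz sampling rate, the 4-second window, and the specific delta/theta/alpha/beta band edges are choices made for the example, not values confirmed by the original experiments.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz

def band_power(window: np.ndarray, lo: float, hi: float) -> float:
    """Mean spectral power in the [lo, hi] Hz band, computed via the FFT."""
    freqs = np.fft.rfftfreq(len(window), d=1 / FS)
    psd = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())

def frequency_features(window: np.ndarray) -> np.ndarray:
    """Band powers in the conventional EEG bands (delta, theta, alpha, beta)."""
    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30)]
    return np.array([band_power(window, lo, hi) for lo, hi in bands])

# Example: a 4-second window dominated by a 6 Hz (theta-band) oscillation
t = np.arange(0, 4, 1 / FS)
sig = np.sin(2 * np.pi * 6 * t)
feats = frequency_features(sig)
print(feats.argmax())  # the theta band (index 1) carries the most power
```

Feature vectors like these, computed over longer pre-ictal windows, are what the extended model was trained on in place of the purely time-domain blink features.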
Broader Impact
Neurological disorders like ALS and epilepsy disproportionately affect individuals in low-resource settings, where access to advanced diagnostics and assistive technologies is limited or nonexistent. Clinical-grade EEG systems can cost upwards of $10,000, and mobility-support devices like robotic exoskeletons often exceed $50,000. This puts them far beyond reach for the average patient. As a result, millions of individuals are left without the tools needed to communicate, move, or receive real-time neurological care.
This project demonstrates the potential of low-cost, machine learning–assisted brain-computer interfaces to bridge that gap. By leveraging sub-$10 ECG sensors, open-source software, and 3D-printed hardware, the system offers a scalable alternative to traditional neurotechnology. Studies have shown that brain-computer interface (BCI) systems can significantly improve functional independence in individuals with motor impairments, with some interventions leading to improvements in upper limb motor function and daily task performance of up to 70%. Additionally, seizure forecasting using wearable devices has demonstrated predictive accuracy with area under the curve (AUC) scores reaching 0.77, offering the potential to reduce seizure-related injuries for four out of six patients through timely risk awareness.
The dual functionality of this BCI prototype, enabling robotic movement via neural input and identifying pre-ictal EEG-like patterns, addresses critical gaps in response time, safety, and day-to-day autonomy for individuals living with epilepsy or paralysis.
Looking ahead, this framework could be further scaled into wearable headsets for daily use, integrated with cloud-based monitoring platforms, or expanded to detect a wider range of neural events. With further development, this research could serve as the foundation for accessible, AI-powered neurotools that democratize healthcare access for millions worldwide.