Brain-Controlled Robot for Assisting Basic Upper Limb Tasks
This was my first real introduction to BCI and EEG work.
The idea was simple: can we use a hobby-grade portable EEG sensor to control a robotic arm?
I wanted to explore whether basic EEG signals can trigger simple robotic actions using thought-linked gestures. It was an early experiment, but it showed that even low-cost EEG hardware can drive meaningful control of assistive devices.
Overview
- Conference Demo: IEEE Body Sensor Networks (BSN) 2023
- Collaborators: Dr. Mohammad Arif Ul Alam (UMass Lowell), Dr. João Luís G. Rosa (University of São Paulo)
- Hardware: Muse 2 EEG headband, Raspberry Pi-based robotic arm
- Objective: Upper-limb motion control via low-cost, portable EEG interface
System Design
The system converts band-power features extracted from the raw EEG (alpha, beta, gamma, delta, and theta bands) into control commands using a threshold-based event detector.
A Raspberry Pi processes classified EEG inputs and drives the robotic arm’s motors through serial communication.
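To make that pipeline concrete, here is a minimal sketch of a threshold-based detector of this kind, assuming the Muse 2 is streamed over Lab Streaming Layer (e.g. via `muselsl stream`) and the arm's controller accepts single-character commands over a serial port. The port name, command characters, band choices, and threshold value are illustrative placeholders, not the values from the actual project code.

```python
"""Illustrative sketch of a threshold-based EEG -> robot-arm loop.

Assumptions (placeholders, not the project's actual code): the Muse 2 is
streamed over Lab Streaming Layer (e.g. `muselsl stream`), the arm's
controller listens on /dev/ttyACM0, and it understands single-character
commands such as b"O" (open) and b"S" (stop).
"""
import numpy as np
import serial                      # pyserial
from pylsl import StreamInlet, resolve_byprop

FS = 256                           # Muse 2 sampling rate in Hz
BANDS = {"alpha": (8, 13), "beta": (13, 30)}


def band_power(window, fs, lo, hi):
    """Mean spectral power of a 1-D channel window in the [lo, hi) Hz band."""
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()


def main():
    streams = resolve_byprop("type", "EEG", timeout=10)
    if not streams:
        raise RuntimeError("No EEG stream found; is the Muse streaming over LSL?")
    inlet = StreamInlet(streams[0])
    arm = serial.Serial("/dev/ttyACM0", 9600, timeout=1)   # placeholder port

    while True:
        # Pull roughly one second of data; each sample holds one value per electrode.
        samples, _ = inlet.pull_chunk(timeout=2.0, max_samples=FS)
        if not samples:
            continue
        data = np.asarray(samples)        # shape: (n_samples, n_channels)
        ch = data[:, 0]                   # use a single channel for the sketch

        alpha = band_power(ch, FS, *BANDS["alpha"])
        beta = band_power(ch, FS, *BANDS["beta"])

        # Threshold-based event detector: a high beta/alpha ratio is treated as
        # "engaged" and mapped to a motor command sent over serial.
        if beta / (alpha + 1e-9) > 1.5:   # illustrative threshold
            arm.write(b"O")               # e.g. open the gripper
        else:
            arm.write(b"S")               # e.g. hold position


if __name__ == "__main__":
    main()
```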

What I Learned & Why It Was Hard
Training a model to “detect thoughts” with a hobby EEG device is extremely difficult.
Because the Muse 2 has only four electrodes, placed on the forehead (AF7, AF8) and behind the ears (TP9, TP10), the signals mainly reflect:
- Focus
- Attention
- Facial muscle movement
But intentional movement originates in the motor cortex, at the top of the head, far from where the Muse collects data. This meant true “thought-based motion classification” wasn’t realistic with this device: the model I trained was mostly overfitting to my own brain patterns, and it was limited to distinguishing just two motions. On the other hand, blink and jaw-clench events were very easy to detect, because they produce large, distinctive artifacts in the frontal channels (a minimal detection sketch follows below).
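A blink or jaw clench shows up as a large, short-lived amplitude spike on the frontal channels, so a simple peak-to-peak threshold is enough to catch it. The sketch below illustrates the idea; the microvolt thresholds, channel assumptions, and command letters are made up for illustration and would need calibration on real data rather than reflecting the project's actual settings.

```python
import numpy as np

# Illustrative microvolt thresholds; real values need per-user calibration.
BLINK_PTP_UV = 150.0     # blink: large, brief deflection on the frontal channels
CLENCH_PTP_UV = 400.0    # jaw clench: an even larger EMG burst


def detect_artifact_event(window_uv):
    """Classify a short frontal-channel window (1-D array, microvolts).

    Returns "clench", "blink", or None based on peak-to-peak amplitude.
    """
    ptp = np.ptp(window_uv)              # peak-to-peak amplitude of the window
    if ptp > CLENCH_PTP_UV:
        return "clench"
    if ptp > BLINK_PTP_UV:
        return "blink"
    return None


# Example mapping of events to arm commands (command letters are placeholders):
# event = detect_artifact_event(latest_window)
# if event == "blink":
#     arm.write(b"G")    # e.g. toggle the gripper
# elif event == "clench":
#     arm.write(b"R")    # e.g. rotate the wrist
```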


Impact & Future Work
Now that I better understand the relationship between brain regions and movement intention, the next step is clear:
- Move from 4 frontal electrodes → to multi-channel EEG covering the motor cortex
- Try a portable headset like the Emotiv Epoc X, which places electrodes over the motor areas
- Re-run the study with better spatial coverage
- Eventually test it in rehabilitation or assistive-tech contexts
In the end, I did discover a really interesting use case for the Muse. Because its frontal signals mostly reflect focus, the device is much better suited for studying attention and mental engagement: deep work, the flow state, or what athletes call being “in the zone.” I even ran a small focus study around this concept, which you can check out here.
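One widely used engagement measure from the EEG literature (not necessarily the one my focus study used) is the beta / (alpha + theta) band-power ratio, often attributed to Pope et al. (1995), which tends to rise with attentional load. A minimal sketch, with purely illustrative numbers in the usage example:

```python
def engagement_index(alpha_power, beta_power, theta_power):
    """Classic beta / (alpha + theta) engagement ratio.

    Higher values are usually interpreted as greater attentional engagement.
    The epsilon guards against division by zero on flat signals.
    """
    return beta_power / (alpha_power + theta_power + 1e-9)


# Example: compare a "deep work" window against a resting baseline.
# focused = engagement_index(alpha_power=4.0, beta_power=9.0, theta_power=3.5)
# resting = engagement_index(alpha_power=8.0, beta_power=4.0, theta_power=5.0)
# focused > resting  ->  True for these illustrative numbers
```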
Resources
- Code Repository: GitHub – NazimBL/EEG-Controlled-Robot
- Publication: ResearchGate Article
- Demo Video: EEG-Controlled Robotic Arm (BSN 2023)