>_Origin
Two hours. That's how long the first working prototype took — from zero to a real-time EEG visualization app connected to a Muse 2 headband via Web Bluetooth.
What started as a weekend experiment grew into a full multi-platform application with native iOS/Android implementations, 128 cognitive event detectors, frequency band analysis, PDF reporting, and support for three generations of Muse hardware — including the latest Muse S Athena with its completely different BLE protocol.
The project bridges neuroscience and software engineering, combining real-time signal processing with cross-platform distribution. It's used alongside my Psychology studies at SWPS University.
>_Muse BLE Protocol
The app supports two completely different Muse protocols. The Classic protocol (Muse 2016, Muse 2, Muse S Gen 2) exposes one EEG characteristic per channel; the Athena protocol, used by the newest headset, multiplexes every sensor through a single characteristic with type-tagged packets.
Classic Protocol
- ▸4 separate GATT characteristics, one per channel
- ▸12-bit packed samples (20 bytes = 12 samples)
- ▸Scale: 1450.0 / 16383.0 µV per bit
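A Classic-packet decode can be sketched from the layout above. One assumption not stated here: a 2-byte sequence index leading each 20-byte packet, big-endian 12-bit packing, and a 2048-count midpoint, as in the open-source muse-js decoder.

```typescript
const EEG_SCALE = 1450.0 / 16383.0; // µV per bit, from the spec above

/** Unpack a 20-byte Classic EEG packet: 2-byte sequence index + 12 packed 12-bit samples. */
function decodeClassicPacket(packet: Uint8Array): { seq: number; samples: number[] } {
  if (packet.length !== 20) throw new Error("expected a 20-byte EEG packet");
  const seq = (packet[0] << 8) | packet[1];  // assumed sequence index
  const data = packet.subarray(2);           // 18 bytes = 12 × 12 bits
  const samples: number[] = [];
  for (let i = 0; i < 12; i++) {
    const j = Math.floor(i * 1.5);           // 12 bits = 1.5 bytes per sample
    const raw = i % 2 === 0
      ? (data[j] << 4) | (data[j + 1] >> 4)        // even sample: byte + high nibble
      : ((data[j] & 0x0f) << 8) | data[j + 1];     // odd sample: low nibble + byte
    samples.push((raw - 2048) * EEG_SCALE);        // center at midpoint, scale to µV
  }
  return { seq, samples };
}
```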
Athena Protocol
- ▸Single multiplexed characteristic, type-tagged packets
- ▸14-bit LSB-first encoding, variable packet structure
- ▸Additional sensors: accelerometer, gyroscope, PPG
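An illustrative demultiplexer for the Athena-style stream. Only "single characteristic, type-tagged packets, 14-bit LSB-first samples" comes from the notes above; the tag byte position and the 0x01/0x02/0x03 values are placeholders.

```typescript
type AthenaPacket =
  | { kind: "eeg"; samples: number[] }
  | { kind: "imu" | "ppg"; payload: Uint8Array };

/** Read `count` 14-bit unsigned values, least-significant bit first. */
function read14BitLsb(bytes: Uint8Array, count: number): number[] {
  const out: number[] = [];
  let bitPos = 0;
  for (let i = 0; i < count; i++) {
    let value = 0;
    for (let b = 0; b < 14; b++, bitPos++) {
      const bit = (bytes[bitPos >> 3] >> (bitPos & 7)) & 1; // LSB-first bit stream
      value |= bit << b;
    }
    out.push(value);
  }
  return out;
}

function demuxAthena(packet: Uint8Array): AthenaPacket {
  const tag = packet[0];              // hypothetical leading tag byte
  const body = packet.subarray(1);
  switch (tag) {
    case 0x01: return { kind: "eeg", samples: read14BitLsb(body, Math.floor((body.length * 8) / 14)) };
    case 0x02: return { kind: "imu", payload: body };
    case 0x03: return { kind: "ppg", payload: body };
    default: throw new Error(`unknown sensor tag 0x${tag.toString(16)}`);
  }
}
```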
Athena Sensor Tags
>_Visualization Engine
Canvas-based rendering at 60 FPS. Ring buffer architecture for constant memory usage. Stacked lane display for live trace + up to 5 saved epochs.
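The constant-memory idea is a fixed-capacity ring buffer: writes overwrite the oldest sample, and each frame takes an oldest-to-newest snapshot for the redraw. A minimal sketch (the class name and API are illustrative):

```typescript
class RingBuffer {
  private buf: Float32Array;
  private head = 0;   // next write position
  private filled = 0; // number of valid samples (<= capacity)

  constructor(readonly capacity: number) {
    this.buf = new Float32Array(capacity); // allocated once, never grows
  }

  push(sample: number): void {
    this.buf[this.head] = sample;
    this.head = (this.head + 1) % this.capacity; // wrap around, overwriting oldest
    if (this.filled < this.capacity) this.filled++;
  }

  /** Oldest-to-newest copy, e.g. for one canvas redraw. */
  snapshot(): Float32Array {
    const out = new Float32Array(this.filled);
    const start = (this.head - this.filled + this.capacity) % this.capacity;
    for (let i = 0; i < this.filled; i++) {
      out[i] = this.buf[(start + i) % this.capacity];
    }
    return out;
  }
}
```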
Frequency Band Analysis
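Band power can be computed per FFT window as the summed squared magnitude of the bins inside each band. A sketch using a naive DFT for clarity; the five band edges below are the conventional EEG definitions, not cutoffs confirmed by this page.

```typescript
// Conventional EEG band edges in Hz (assumed, not the app's exact values).
const BANDS: Record<string, [number, number]> = {
  delta: [1, 4], theta: [4, 8], alpha: [8, 13], beta: [13, 30], gamma: [30, 44],
};

/** Power of `signal` (sampled at `fs` Hz) between `lo` and `hi` Hz, via a naive DFT. */
function bandPower(signal: number[], fs: number, lo: number, hi: number): number {
  const n = signal.length;
  let power = 0;
  for (let k = 1; k < n / 2; k++) {
    const freq = (k * fs) / n;
    if (freq < lo || freq >= hi) continue; // only bins inside the band
    let re = 0, im = 0;
    for (let t = 0; t < n; t++) {
      const phase = (-2 * Math.PI * k * t) / n;
      re += signal[t] * Math.cos(phase);
      im += signal[t] * Math.sin(phase);
    }
    power += (re * re + im * im) / (n * n); // normalized squared magnitude
  }
  return power;
}
```

A production path would use an FFT instead of the O(n²) DFT; the band-summing logic is the same.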
Performance
Signal Filtering
>_128 Event Detectors
Heuristic-based cognitive event detection across 13 categories. Each detector has configurable thresholds and cooldown periods to prevent double-firing. Multi-channel fusion for complex events.
- ▸Blink, jaw clench, eyebrow raise, smile
- ▸Left/right/up/down movement, wink, REM
- ▸Alpha burst, deep/light relaxation
- ▸Flow state, concentration, mind wandering
- ▸Theta burst, mindfulness, transcendence
- ▸Drowsy, microsleep, sleep spindle, K-complex
- ▸Positive/negative, surprise, frustration
- ▸Mental math, aha moment, gamma burst
- ▸Left/right hand imagery, mu suppression
- ▸Rhythm sync, melody processing
- ▸Inner speech, verbal thinking, empathy
- ▸Stress spike, anxiety pattern, fight-or-flight
- ▸Deep breath, breath hold, slow breathing
The cooldown system prevents false positives: the blink cooldown, for example, is 80 samples (~312 ms at 256 Hz), so a rapid flutter doesn't fire as multiple discrete blinks. Detection latency is 200–500 ms due to FFT window requirements.
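The cooldown gate reduces to a per-detector record of the last firing sample. A sketch (class and detector names are illustrative; the 80-sample blink value matches the example above):

```typescript
class CooldownGate {
  private lastFired = new Map<string, number>();

  constructor(private cooldownSamples: Map<string, number>) {}

  /** Returns true (and arms the cooldown) only if enough samples have elapsed. */
  tryFire(detector: string, sampleIndex: number): boolean {
    const last = this.lastFired.get(detector);
    const cooldown = this.cooldownSamples.get(detector) ?? 0;
    if (last !== undefined && sampleIndex - last < cooldown) return false; // still cooling down
    this.lastFired.set(detector, sampleIndex);
    return true;
  }
}
```

With a blink cooldown of 80, a blink at sample 0 fires, a flutter at sample 40 is suppressed, and a blink at sample 100 fires again.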
>_7+ Platforms
Shared TypeScript core across Electron and Web. Native SwiftUI and Kotlin re-implement key algorithms for full platform integration.
macOS
Electron DMG + ZIP — Universal (Apple Silicon + Intel)
Windows
Electron NSIS — Installer (.exe)
Linux
AppImage + DEB — Ubuntu, Fedora, Arch
iOS / iPadOS
SwiftUI native — 5,140 LOC, 43 Swift files
watchOS
Companion app — Glanceable brain state
tvOS / visionOS
SwiftUI target — Spatial display ready
Android
Capacitor + Kotlin — 4,124 LOC, BLE native
Web PWA
Vite + Service Workers — Installable, offline
4,501
TypeScript LOC
5,140
Swift LOC
4,124
Kotlin LOC
>_Demo System
Six synthetic EEG presets let users explore the app without a headband. Each preset simulates realistic brain activity with configurable alpha, beta, theta, and delta power levels, blink artifacts, and noise.
Relaxed
Eyes closed, calm
Music
Listening to music
Focus
Concentration task
Meditation
Deep meditation
Drowsy
Falling asleep
Anxious
Stress, agitation
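A preset boils down to per-band amplitudes plus noise, summed as band-centered sinusoids. A minimal sketch; these particular amplitudes and band center frequencies are illustrative, not the app's actual preset values.

```typescript
interface Preset { alpha: number; beta: number; theta: number; delta: number; noise: number }

// Hypothetical "Relaxed" preset: alpha-dominant, low beta.
const RELAXED: Preset = { alpha: 20, beta: 4, theta: 6, delta: 8, noise: 2 };

/** One synthetic EEG sample (µV) at sample index t, sample rate fs. */
function synthSample(t: number, fs: number, p: Preset): number {
  const s = (f: number, amp: number) => amp * Math.sin(2 * Math.PI * f * (t / fs));
  return (
    s(10, p.alpha) +  // alpha centered ~10 Hz
    s(20, p.beta) +   // beta ~20 Hz
    s(6, p.theta) +   // theta ~6 Hz
    s(2, p.delta) +   // delta ~2 Hz
    (Math.random() - 0.5) * 2 * p.noise // uniform noise in ±noise µV
  );
}
```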
>_Export & Internationalization
PDF & CSV Export
Dark-themed PDF reports generated via jsPDF with 3x upscaled canvas rendering for print quality. CSV export with sample, TP9, AF7, AF8, TP10 columns — compatible with MATLAB, Python, and R analysis tools.
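The CSV layout named above (sample index plus the four Muse channels) can be sketched as a one-row-per-sample serializer; the `EegRow` type is an assumption for illustration:

```typescript
type EegRow = { tp9: number; af7: number; af8: number; tp10: number };

/** Serialize recorded samples to the sample,TP9,AF7,AF8,TP10 CSV layout. */
function toCsv(rows: EegRow[]): string {
  const header = "sample,TP9,AF7,AF8,TP10";
  const lines = rows.map((r, i) => [i, r.tp9, r.af7, r.af8, r.tp10].join(","));
  return [header, ...lines].join("\n");
}
```

The resulting file loads directly with `pandas.read_csv`, MATLAB's `readtable`, or R's `read.csv`.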
11 Languages
Full i18n with 300+ translated keys per locale. Dynamic UI rebuilding on language switch — all data-i18n elements, channel descriptions, and event labels update instantly.
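The lookup behind such a switch is a keyed string table per locale, with a fallback chain; the UI layer then walks every `data-i18n` element and swaps in the active locale's string. A sketch of the lookup (locale contents here are made up):

```typescript
type Locale = Record<string, string>;

// Illustrative excerpt -- the real app ships 300+ keys per locale.
const locales: Record<string, Locale> = {
  en: { "app.title": "Brainwave Monitor" },
  pl: { "app.title": "Monitor fal mózgowych" },
};

/** Resolve a key in the active locale, falling back to English, then the key itself. */
function translate(lang: string, key: string): string {
  return locales[lang]?.[key] ?? locales["en"][key] ?? key;
}
```

Falling back to the key (rather than throwing) keeps a partially translated locale usable while its remaining strings are filled in.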
>_By the Numbers
17,625+
Lines of Code
128
Event Detectors
7+
Platforms
256
Hz Sample Rate
11
Languages
5
Frequency Bands
13
Event Categories
6
Demo Presets