>_ case study

EEG Visualizer

Real-time brain activity visualization for InteraXon Muse headbands. Four EEG channels at 256 Hz, frequency band analysis, 128 cognitive event detectors, PDF reports - running on every major platform from a single codebase.

Desktop | Electron 28 + esbuild
iOS | SwiftUI + CoreBluetooth
Android | Kotlin + Capacitor
Web | PWA + Service Workers
BLE | muse-js + native
Export | jsPDF + CSV
i18n | 11 languages
Events | 128 detectors

>_Origin

Two hours. That's how long the first working prototype took - from zero to a real-time EEG visualization app connected to a Muse 2 headband via Web Bluetooth.

What started as a weekend experiment grew into a full multi-platform application with native iOS/Android implementations, 128 cognitive event detectors, frequency band analysis, PDF reporting, and support for three generations of Muse hardware - including the latest Muse S Athena with its completely different BLE protocol.

The project bridges neuroscience and software engineering, combining real-time signal processing with cross-platform distribution. It's used alongside my Psychology studies at SWPS University.

>_Muse BLE Protocol

The app supports two completely different Muse protocols. The Classic protocol (Muse 2016/2, Muse S Gen 2) uses one GATT characteristic per EEG channel. The Athena protocol (Muse S Athena, the newest hardware) multiplexes all sensors through a single characteristic with type-tagged packets.

EEG channels (256 Hz, 4ch)

  • TP9 | Left ear
  • AF7 | Left forehead
  • AF8 | Right forehead
  • TP10 | Right ear

Classic Protocol

  • 4 separate GATT characteristics, one per channel
  • 12-bit packed samples (20 bytes = 12 samples)
  • Scale: 1450.0 / 16383.0 µV per bit
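The 12-bit packing above can be sketched as follows. This assumes the 20-byte layout used by muse-js (a 2-byte sequence counter followed by 18 bytes holding 12 big-endian 12-bit samples) and mid-scale centering at 2048; the function name and centering are illustrative, not the app's actual code.

```typescript
const EEG_SCALE = 1450.0 / 16383.0; // µV per bit, from the scale above

function decodeClassicPacket(packet: Uint8Array): number[] {
  if (packet.length !== 20) throw new Error("expected 20-byte EEG packet");
  const samples: number[] = [];
  // Skip the 2-byte counter; every 3 bytes encode two 12-bit samples.
  for (let i = 2; i < 20; i += 3) {
    const b0 = packet[i], b1 = packet[i + 1], b2 = packet[i + 2];
    const s1 = (b0 << 4) | (b1 >> 4);
    const s2 = ((b1 & 0x0f) << 8) | b2;
    // Center around mid-scale (2048) and convert to microvolts.
    samples.push((s1 - 2048) * EEG_SCALE, (s2 - 2048) * EEG_SCALE);
  }
  return samples; // 12 samples per packet
}
```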

Athena Protocol

  • Single multiplexed characteristic, type-tagged packets
  • 14-bit LSB-first encoding, variable packet structure
  • Additional sensors: accelerometer, gyroscope, PPG

Athena Sensor Tags

  • 0x11 | EEG 4ch
  • 0x12 | EEG 8ch
  • 0x47 | ACC+GYRO
  • 0x34 | PPG 4ch
  • 0x35 | PPG 8ch
  • 0x36 | PPG 16ch
  • 0x88 | Battery
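A type-tagged multiplexed characteristic is naturally handled with a tag-to-handler dispatch. This is a hypothetical sketch (handler registry and names are illustrative), assuming the first payload byte carries the sensor tag from the table above:

```typescript
type AthenaHandler = (payload: Uint8Array) => void;

// Registry mapping sensor tags (first byte of each packet) to handlers.
const handlers = new Map<number, AthenaHandler>();

function onAthenaPacket(packet: Uint8Array): void {
  const tag = packet[0];
  const handler = handlers.get(tag);
  // Unknown tags are skipped so a firmware update adding new sensor
  // types does not crash the parser.
  if (!handler) return;
  handler(packet.subarray(1));
}
```

Registering a handler for `0x11` (EEG 4ch) then routes every matching packet's payload to the EEG decoder while other tags go to their own handlers.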

>_Visualization Engine

Canvas-based rendering at 60 FPS. Ring buffer architecture for constant memory usage. Stacked lane display for live trace + up to 5 saved epochs.
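The constant-memory claim follows from the ring-buffer pattern: writes wrap around a fixed array instead of growing it. A minimal sketch (class and method names are illustrative):

```typescript
class RingBuffer {
  private buf: Float32Array;
  private head = 0;   // next write position
  private filled = 0; // number of valid samples

  constructor(capacity: number) {
    this.buf = new Float32Array(capacity);
  }

  push(sample: number): void {
    this.buf[this.head] = sample;
    this.head = (this.head + 1) % this.buf.length;
    this.filled = Math.min(this.filled + 1, this.buf.length);
  }

  // Oldest-to-newest copy for rendering a full trace.
  toArray(): Float32Array {
    const out = new Float32Array(this.filled);
    const start = (this.head - this.filled + this.buf.length) % this.buf.length;
    for (let i = 0; i < this.filled; i++) {
      out[i] = this.buf[(start + i) % this.buf.length];
    }
    return out;
  }
}
```

At 256 Hz, a 30 s window is 7,680 samples per channel; four Float32 live buffers come to roughly 123 KB, consistent with the ~400 KB total quoted below once saved epochs are included.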

ui layout
┌────────────────────────────────────────────────┐
│ TOOLBAR  Connect | Demo | View | Export | Lang │
├──────────────────────────────────┬─────────────┤
│ ≈≈≈ TP9                          │ Legend      │
│ ≈≈≈ AF7                          │ Epochs      │
│ ≈≈≈ AF8                          │ Filters     │
│ ≈≈≈ TP10                         │ Events      │
└──────────────────────────────────┴─────────────┘

Frequency Band Analysis

  • Delta | 1-4 Hz | Deep sleep, unconscious processing
  • Theta | 4-8 Hz | Meditation, drowsiness, memory
  • Alpha | 8-13 Hz | Relaxation, eyes closed, calm
  • Beta | 13-30 Hz | Focus, concentration, alertness
  • Gamma | 30-50 Hz | Higher cognition, perception
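Band analysis reduces to summing spectral power over each band's bin range. A sketch under assumed names and signature (the app's actual FFT pipeline may differ), where `binHz` is the spectrum's bin width (`sampleRate / fftSize`):

```typescript
// Band edges from the table above, in Hz.
const BANDS: [string, number, number][] = [
  ["delta", 1, 4], ["theta", 4, 8], ["alpha", 8, 13],
  ["beta", 13, 30], ["gamma", 30, 50],
];

function bandPowers(magnitudes: number[], binHz: number): Record<string, number> {
  const power: Record<string, number> = {};
  for (const [name] of BANDS) power[name] = 0;
  magnitudes.forEach((mag, i) => {
    const f = i * binHz; // center frequency of bin i
    for (const [name, lo, hi] of BANDS) {
      if (f >= lo && f < hi) power[name] += mag * mag;
    }
  });
  return power;
}
```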

Performance

60 FPS sustained rendering with 4 channels + 5 epochs
~400 KB total memory (ring buffers, 30s window)
5-8% CPU on Apple M1 / Intel i5

Signal Filtering

  • Raw | Unfiltered signal
  • Notch | 50/60 Hz grid hum removal
  • Bandpass | 1-40 Hz (standard EEG range)
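A 50/60 Hz notch is typically a single biquad. This sketch uses the Audio EQ Cookbook notch coefficients in direct form II transposed; the class name and the Q value are illustrative, not necessarily what the app uses:

```typescript
class NotchFilter {
  private b0: number; private b1: number; private b2: number;
  private a1: number; private a2: number;
  private z1 = 0; private z2 = 0;

  constructor(freqHz: number, sampleRate: number, q = 30) {
    // RBJ Audio EQ Cookbook notch coefficients, normalized by a0.
    const w0 = (2 * Math.PI * freqHz) / sampleRate;
    const alpha = Math.sin(w0) / (2 * q);
    const a0 = 1 + alpha;
    this.b0 = 1 / a0;
    this.b1 = (-2 * Math.cos(w0)) / a0;
    this.b2 = 1 / a0;
    this.a1 = (-2 * Math.cos(w0)) / a0;
    this.a2 = (1 - alpha) / a0;
  }

  // Direct form II transposed: one sample in, one sample out.
  process(x: number): number {
    const y = this.b0 * x + this.z1;
    this.z1 = this.b1 * x - this.a1 * y + this.z2;
    this.z2 = this.b2 * x - this.a2 * y;
    return y;
  }
}
```

Running per-sample like this keeps latency at one sample, which matters for a live trace; the bandpass stage can be built the same way from cookbook low/high-pass biquads.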

>_128 Event Detectors

Heuristic-based cognitive event detection across 13 categories. Each detector has configurable thresholds and cooldown periods to prevent double-firing. Multi-channel fusion for complex events.

  • 😊 Face / Muscle (10) | Blink, jaw clench, eyebrow raise, smile
  • 👁 Eyes (10) | Left/right/up/down movement, wink, REM
  • 🧘 Relaxation (4) | Alpha burst, deep/light relaxation
  • 🎯 Focus (7) | Flow state, concentration, mind wandering
  • 🕊 Meditation (6) | Theta burst, mindfulness, transcendence
  • 😴 Sleep (5) | Drowsy, microsleep, sleep spindle, K-complex
  • 😍 Emotion (7) | Positive/negative, surprise, frustration
  • 🧩 Cognitive (7) | Mental math, aha moment, gamma burst
  • Motor (6) | Left/right hand imagery, mu suppression
  • 🎵 Music / Audio (5) | Rhythm sync, melody processing
  • 🗣 Social (4) | Inner speech, verbal thinking, empathy
  • 💢 Stress (4) | Stress spike, anxiety pattern, fight-or-flight
  • 🌬 Breathing (4) | Deep breath, breath hold, slow breathing

Cooldown system prevents false positives. For example, the blink cooldown is 80 samples (~312 ms at 256 Hz), so rapid eyelid flutter doesn't fire as multiple discrete blinks. Detection latency is 200-500 ms due to FFT window requirements.
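The cooldown gating described above can be sketched as a small per-detector state object (names and the sample-index interface are illustrative):

```typescript
class CooldownGate {
  private lastFired = -Infinity;

  constructor(private cooldownSamples: number) {}

  // Returns true (and records the firing) only if at least
  // `cooldownSamples` have elapsed since the last firing.
  tryFire(sampleIndex: number): boolean {
    if (sampleIndex - this.lastFired < this.cooldownSamples) return false;
    this.lastFired = sampleIndex;
    return true;
  }
}

// Blink example from the text: 80 samples ≈ 312 ms at 256 Hz.
const blinkGate = new CooldownGate(80);
```

Each of the 128 detectors would own one gate, so one category firing never suppresses another.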

>_7+ Platforms

Shared TypeScript core across Electron and Web. Native SwiftUI and Kotlin re-implement key algorithms for full platform integration.

  • 💻 macOS | Electron DMG + ZIP | Universal (Apple Silicon + Intel)
  • 🖥 Windows | Electron NSIS | Installer (.exe)
  • 🐧 Linux | AppImage + DEB | Ubuntu, Fedora, Arch
  • 📱 iOS / iPadOS | SwiftUI native | 5,140 LOC, 43 Swift files
  • watchOS | Companion app | Glanceable brain state
  • 📺 tvOS / visionOS | SwiftUI target | Spatial display ready
  • 📱 Android | Capacitor + Kotlin | 4,124 LOC, BLE native
  • 🌐 Web PWA | Vite + Service Workers | Installable, offline

4,501 TypeScript LOC | 5,140 Swift LOC | 4,124 Kotlin LOC

>_Demo System

Six synthetic EEG presets let users explore the app without a headband. Each preset simulates realistic brain activity with configurable alpha, beta, theta, and delta power levels, blink artifacts, and noise.

  • Relaxed | Eyes closed, calm | α 35 µV, β 5, θ 12
  • Music | Listening to music | α 20, β 12, θ 15
  • Focus | Concentration task | α 10, β 20, θ 6
  • Meditation | Deep meditation | α 25, β 4, θ 18
  • Drowsy | Falling asleep | α 12, β 3, θ 25
  • Anxious | Stress, agitation | α 8, β 25, θ 8
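A preset-driven generator can be as simple as one sinusoid per band plus white noise. This is a sketch under assumed shapes (single mid-band frequencies, uniform noise); the preset fields mirror the α/β/θ levels listed above, but the exact waveforms are illustrative:

```typescript
interface DemoPreset { alpha: number; beta: number; theta: number; noise: number; }

// t is time in seconds; amplitudes are in µV as in the preset table.
function synthSample(t: number, p: DemoPreset): number {
  const s =
    p.alpha * Math.sin(2 * Math.PI * 10 * t) + // mid-alpha, 10 Hz
    p.beta  * Math.sin(2 * Math.PI * 20 * t) + // mid-beta, 20 Hz
    p.theta * Math.sin(2 * Math.PI * 6 * t);   // mid-theta, 6 Hz
  return s + p.noise * (Math.random() * 2 - 1);
}

// "Relaxed" preset from the table above (noise level is an assumption).
const relaxed: DemoPreset = { alpha: 35, beta: 5, theta: 12, noise: 2 };
```

Sampling `synthSample(n / 256, relaxed)` at 256 Hz yields a stream the app can feed through the same pipeline as real headband data.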

>_Export & Internationalization

PDF & CSV Export

Dark-themed PDF reports generated via jsPDF with 3x upscaled canvas rendering for print quality. CSV export with sample, TP9, AF7, AF8, TP10 columns - compatible with MATLAB, Python, and R analysis tools.

PDF | CSV | PNG
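The CSV layout described above (a sample index plus the four channel columns) can be sketched as follows; the function name and channel-object shape are illustrative:

```typescript
interface ChannelData { TP9: number[]; AF7: number[]; AF8: number[]; TP10: number[]; }

function toCsv(channels: ChannelData): string {
  // Header matches the column order MATLAB/Python/R scripts expect.
  const rows = ["sample,TP9,AF7,AF8,TP10"];
  const n = channels.TP9.length;
  for (let i = 0; i < n; i++) {
    rows.push([i, channels.TP9[i], channels.AF7[i], channels.AF8[i], channels.TP10[i]].join(","));
  }
  return rows.join("\n");
}
```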

11 Languages

Full i18n with 300+ translated keys per locale. Dynamic UI rebuilding on language switch - all data-i18n elements, channel descriptions, and event labels update instantly.

PL | EN | DE | FR | ES | PT | IT | RU | ZH | JA | KO
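Per-locale key lookup usually falls back to a default language so a missing translation never blanks the UI. A minimal sketch of that pattern (the locale-table shape and function name are assumptions, not the app's actual i18n API):

```typescript
type Locale = Record<string, string>;

// Resolve a key: active locale first, then English, then the raw key
// so untranslated strings stay visible rather than disappearing.
function t(key: string, locale: Locale, fallback: Locale): string {
  return locale[key] ?? fallback[key] ?? key;
}
```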

>_By the Numbers

  • 17,625+ lines of code
  • 128 event detectors
  • 7+ platforms
  • 256 Hz sample rate
  • 11 languages
  • 5 frequency bands
  • 13 event categories
  • 6 demo presets

>_Security

  • Electron: contextIsolation enabled, nodeIntegration disabled, IPC via preload bridge
  • Permission handler: only Bluetooth access allowed, all others denied
  • IPC validation: PDF data URI format checked, 500 MB size cap, no directory traversal
  • XSS prevention: all dynamic content via textContent / createTextNode, no innerHTML
  • iOS: only NSBluetooth entitlements requested, no camera/location/contacts access