A NeuroGaming application suite that translates head motion into cursor movements using the Emotiv Cortex API. This project includes a PyQt6 Configuration UI, a direct OS mouse controller, and a WebSocket-based streaming system for web visualization.
- Configuration UI: A full desktop UI to configure credentials, sensitivity, deadzones, and mental command mappings. Settings are persistently saved to `config.json`.
- Head Tracking: Translates headset gyroscope/quaternion data into smooth mouse cursor movement.
- Granular Mental Commands: Maps mental commands (e.g., "Push") to specific mouse actions (`LeftPress`, `RightRelease`, `LeftAutoRelease`).
- WebSocket Streaming: Broadcasts cursor position to a web client for visualization.
- Cross-Platform: Supports Windows, macOS, and Linux.
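The motion settings above (sensitivity, deadzone, smoothing) typically combine as a small per-sample pipeline. The sketch below is illustrative only; the function and parameter names are ours, not the project's actual API:

```python
def motion_to_delta(gyro_x, gyro_y, sensitivity=1.0, deadzone=0.05):
    """Map a raw gyro sample to a cursor delta, zeroing values inside the deadzone."""
    def axis(v):
        if abs(v) < deadzone:
            return 0.0  # ignore small jitter near rest
        return v * sensitivity
    return axis(gyro_x), axis(gyro_y)

class Smoother:
    """Exponential moving average to smooth jittery head motion."""
    def __init__(self, alpha=0.3):
        self.alpha = alpha          # higher alpha = more responsive, less smooth
        self.value = (0.0, 0.0)

    def update(self, dx, dy):
        ax, ay = self.value
        self.value = (ax + self.alpha * (dx - ax), ay + self.alpha * (dy - ay))
        return self.value
```

Each incoming sample would pass through `motion_to_delta` and then `Smoother.update` before being applied to the cursor.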
- Python 3.8+
- Emotiv Headset (Insight, EPOC+, etc.)
- Cortex App installed and running (for real device connection).
- Emotiv Credentials (Client ID and Client Secret).
- (Optional) A working Display/Window Server for the PyQt6 UI.
It is recommended to use a virtual environment.
macOS/Linux:

```
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Windows:

```
python -m venv venv
venv\Scripts\activate
pip install -r requirements.txt
```

The easiest way to configure and run the application is through the new desktop Dashboard.
```
python ui.py
```

From the UI, you can:
- Enter your Cortex Client ID and Client Secret.
- Toggle Simulation Mode (test without a headset) or OS Mouse Movement (move the real cursor).
- Adjust Motion sliders (Sensitivity, Deadzone, Smoothing).
- Map Mental Commands (Push, Pull, Drop, Lift) to specific Granular Actions (`LeftPress`, `LeftRelease`, `RightPress`, `RightRelease`, `LeftAutoRelease`).
- Start / Stop the WebSocket Transmitter securely in the background.
(All settings are automatically saved locally to `config.json`.)
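For reference, a saved `config.json` might look roughly like this. The exact keys are an assumption based on the settings listed above, not the project's actual schema:

```json
{
  "client_id": "YOUR_CLIENT_ID",
  "client_secret": "YOUR_CLIENT_SECRET",
  "simulation_mode": false,
  "os_mouse_movement": true,
  "sensitivity": 1.0,
  "deadzone": 0.05,
  "smoothing": 0.3,
  "mental_commands": {
    "push": "LeftPress",
    "pull": "LeftRelease"
  }
}
```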
If you prefer to bypass the UI, you can run the core component directly. It will automatically load your saved config.json.
Run with Real Headset (Default): Ensure the Cortex App is running and your headset is connected.

```
python mouse_transmitter.py
```

Run in Simulation Mode: Test the system without a headset (generates simulated movement patterns).

```
python mouse_transmitter.py --simulate
```

A real-time visualizer that displays the cursor movement in your browser.
- Start the Mouse Transmitter (via the UI or command line).
- Open the `web_visualizer/index.html` file in your web browser.
  - macOS: run `open web_visualizer/index.html`
  - Windows: double-click the file or run `start web_visualizer/index.html`
The blue circle confirms the connection and begins moving based on your head-tracking data.
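The transmitter-to-visualizer protocol is presumably a stream of small JSON frames carrying cursor positions. A minimal sketch of encoding and decoding one frame; the `{"x": ..., "y": ...}` field names are an assumption, not the project's confirmed wire format:

```python
import json

def encode_frame(x, y):
    # Serialize one cursor position for the WebSocket stream.
    # The {"x": ..., "y": ...} shape is an assumed message format.
    return json.dumps({"x": x, "y": y})

def decode_frame(raw):
    # Parse a frame back into a coordinate pair, as the web client might.
    msg = json.loads(raw)
    return msg["x"], msg["y"]
```

Keeping frames this small lets the visualizer redraw on every message without buffering.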
- `ui.py`: (New) PyQt6 Dashboard to configure credentials, motion, mappings, and launch the server.
- `config_manager.py`: (New) Handles loading/saving settings to `config.json`.
- `mouse_transmitter.py`: WebSocket server that streams head-tracking coordinates.
- `web_visualizer/`: Web client for visualization.
  - `index.html`: Entry point.
  - `script.js`: WebSocket logic.
  - `style.css`: Visual styling.
- `mouse_controller.py`: Legacy script for direct OS mouse control.
- `cortex.py`: Cortex API wrapper.
- `neurogaming/`: Core logic for signal processing and movement calculation.
- Address already in use: If the server fails to start, ensure no other instance is running on port 8765.
- Chrome not found: The visualizer is a static HTML file; it works in any modern browser (Safari, Firefox, Edge, Chrome).
- Dependencies: Ensure `websockets` is installed (`pip install websockets`).
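To diagnose the "Address already in use" error, you can probe whether port 8765 is actually free before starting the transmitter. A small standard-library sketch; the helper name is ours, not part of the project:

```python
import socket

def port_is_free(port, host="127.0.0.1"):
    """Return True if we can bind the port, i.e. no other server currently holds it."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        try:
            s.bind((host, port))
            return True
        except OSError:
            return False
```

Usage: `port_is_free(8765)` returns `False` while another transmitter instance is listening; stop that instance and retry.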