Demo: Lab Smarter with PLASMA and MotionSenseHRV
Abstract
Cognitive science is shifting from unimodal to multimodal approaches. Yet the common practice of pairing each sensor with its own app scales poorly, leading to rapidly growing system complexity. This setup requires human operators to manually initiate data collection for each sensor in sequence, a procedure prone to human error and potential data loss. The setup is also labor-intensive, as operators must frequently switch between task windows during collection. Finally, precise temporal alignment across sensors with heterogeneous sampling rates, transmission methods, and storage protocols remains a major challenge.
In this demo, we present 1) PLASMA: a unified instrumentation platform enabling a smart lab experience, and 2) MotionSenseHRV: an open-source wristband design that collects high-frequency raw IMU and PPG data and broadcasts Bluetooth beacons to facilitate time synchronization across distributed sensor storage points.
We begin with PLASMA: Platform for LSL-based Acquisition of Sensor Metrics and Analytics, which provides a unified instrumentation and visualization interface for a multimodal sensor suite. The app, built upon the Lab Streaming Layer (LSL), enables easy time synchronization in post-processing. The app also features a centralized dashboard offering real-time visualization of individual sensor status and readings, providing a task-centered interface that minimizes operator workload throughout each stage of data collection. The dashboard also alerts the operator to sensor errors during collection, minimizing the impact of missing data. The system supports a growing list of sensors, including Pupil Labs (gaze, IMU, and eye events), Shimmer GSR, and Bitalino ECG, as well as interfaces to other applications such as mBrainTrain and OBS Studio.
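To illustrate what LSL-style post-processing synchronization involves, the sketch below aligns two timestamped streams with different sampling rates onto a common timeline via linear interpolation. This is a minimal, illustrative pure-Python example with toy data; the function names and rates are our assumptions, not PLASMA's actual implementation.

```python
# Illustrative sketch: aligning two timestamped streams in post-processing.
# Stream names, rates, and values are toy assumptions, not PLASMA internals.

def interpolate(ts, values, t):
    """Linearly interpolate a (timestamp, value) series at time t."""
    if t <= ts[0]:
        return values[0]
    if t >= ts[-1]:
        return values[-1]
    for i in range(1, len(ts)):
        if ts[i] >= t:
            frac = (t - ts[i - 1]) / (ts[i] - ts[i - 1])
            return values[i - 1] + frac * (values[i] - values[i - 1])

def align_streams(target_ts, src_ts, src_vals):
    """Resample the source stream onto the target stream's timeline."""
    return [interpolate(src_ts, src_vals, t) for t in target_ts]

# A 100 Hz "IMU" timeline and a 25 Hz "PPG" stream on the same shared clock.
imu_ts = [i / 100 for i in range(11)]   # 0.00 .. 0.10 s
ppg_ts = [i / 25 for i in range(4)]     # 0.00 .. 0.12 s
ppg_vals = [0.0, 1.0, 2.0, 3.0]         # toy PPG readings

aligned = align_streams(imu_ts, ppg_ts, ppg_vals)
print(aligned[4])  # PPG value interpolated at t = 0.04 s
```

In practice, LSL additionally estimates per-stream clock offsets at acquisition time, so the interpolation step above operates on already-commensurable timestamps.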
We also showcase MotionSenseHRV v4, an open-source wristband sensor design that provides raw recording of motion and PPG data. The wristband features 4 GB of onboard storage, sufficient for up to 30 days of recording. The onboard sensors consist of a 9-axis inertial measurement unit (IMU) and a photoplethysmography (PPG) sensor that measures blood flow using green and infrared wavelengths. The device computes physical-activity summary statistics and broadcasts them via Bluetooth Low Energy (BLE) packets that also serve as time-synchronization beacons. Through its native integration with PLASMA, these sensor metrics can be visualized on the real-time dashboard.
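As a rough sketch of how beacons can anchor synchronization, a receiver that logs each beacon's device clock alongside its own arrival time can estimate the clock offset between the two and map the wristband's stored timestamps onto the receiver's clock. The payload layout and names below are our assumptions for illustration, not the MotionSenseHRV beacon specification.

```python
# Illustrative sketch: estimating a clock offset from beacon observations.
# Beacon fields and values are assumed for this example, not taken from
# the MotionSenseHRV firmware.
from statistics import median

def estimate_offset(beacons):
    """Each observation is (device_clock, receiver_clock); the median
    of the differences is a jitter-robust offset estimate."""
    return median(rx - dev for dev, rx in beacons)

def to_receiver_time(device_ts, offset):
    """Map a timestamp logged on the wristband onto the receiver clock."""
    return device_ts + offset

# Receiver clock runs ~2.5 s ahead of the wristband; arrival times jitter.
beacons = [(10.0, 12.501), (11.0, 13.499), (12.0, 14.502)]
offset = estimate_offset(beacons)
print(round(offset, 3))                # estimated offset in seconds
print(to_receiver_time(10.5, offset))  # wristband sample on receiver clock
```

Using the median rather than the mean keeps a single delayed packet from skewing the estimate, which matters for BLE advertising where arrival latency varies.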
In the demo, participants will be invited to wear a pair of MotionSenseHRV wristbands and try out the PLASMA software to instrument a broad spectrum of sensors we bring on-site. Participants will initiate and stop a multi-sensor streaming/recording session in our software. While the session is running, they will browse a real-time visualization dashboard. Besides the aforementioned sensors, participants will also see a live skeleton estimation computed in real time from a camera sensor attached to the system. A custom sensor fault is built into the demo software to simulate a realistic failure scenario that may occur in lab settings.