Demo: Standardized XR Tools for Capturing Brain and Behavior Data in Context
Abstract
Brain-body foundation models that integrate neural activity with behavioral data are necessary to fully capture the dynamic contexts that shape human cognition. Yet access to behavioral signals such as movement, gaze, posture, and facial expressions has historically been limited in neuroscience research, and the proper interpretation of these signals depends critically on context. Progress toward brain-body foundation models will therefore benefit from tools capable of capturing comprehensive neural, behavioral, and contextual data at scale. Extended Reality (XR) technologies offer researchers a unique opportunity to record brain, behavior, and context simultaneously in naturalistic yet controlled environments. However, the technical complexity of XR development remains a barrier to adoption in many labs, and the lack of common protocols for collecting XR data makes it difficult to combine these rich datasets across research groups.

SilicoLabs addresses these challenges with LABO: a no/low-code platform that enables researchers to design interactive experiences that simulate the real world and capture high-fidelity behavioral and biosensor data in a standardized, hardware-independent format. In addition to collecting detailed event-related data, LABO supports tracking of hands, face, body, and gaze from XR headsets, while integrating with research-grade systems for brain recording (e.g., Wearable Sensing dry-electrode EEG), eye tracking (e.g., Pupil Labs NeonXR), and movement capture (e.g., Sony mocopi). Furthermore, LABO allows these signals to be used as active inputs that influence and control elements of the task environment, demonstrating the potential for dynamic, closed-loop experimental paradigms.

In this demo, participants will be able to interact with the LABO software as well as the Wearable Sensing DSI-VR300, a mobile EEG device created specifically for XR applications. Participants will be invited to try a short immersive XR experience while controlling elements of their virtual environment with the connected EEG system. The real-time neural and behavioral data streams will be displayed simultaneously in LABO’s integrated data viewer for both the participant and onlookers, giving live insight into how these multimodal signals can be collected in embodied contexts. The goal of the demo is to introduce participants to both LABO’s capabilities and the advances in simultaneous XR and EEG recording made possible by Wearable Sensing’s DSI-VR300, which together lower the barrier to entry for XR-based neuroscience research. We believe attendees will be particularly interested in the simultaneous data streams afforded by XR, which are rarely available together in traditional neuroscience experiments.

Finally, LABO’s data outputs are being designed to interface with torch-brain, an open-source Python library that supports the training and evaluation of deep learning models for neuroscience. A core focus of this integration is the development of standardized XR data formats that promote reproducibility, open science, and interoperability, thereby enabling researchers to capture brain-body data at scale. By experiencing how LABO makes such data accessible and standardized, participants will see first-hand how XR experiments can directly contribute to building shared multimodal datasets for training large-scale brain-body models that span devices, populations, and contexts.
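
To make the idea of a standardized, hardware-independent XR record concrete, the sketch below shows one way a single multimodal trial (EEG, gaze, hand pose, and task events) could be packaged for a PyTorch-based training pipeline. This is a minimal, hypothetical illustration: the class name XRTrial, its fields, and the sampling rates are assumptions made for this example and do not reflect LABO's or torch-brain's actual schema.

# Minimal, hypothetical sketch of a standardized multimodal XR trial record.
# The class name, fields, and sampling rates are illustrative assumptions,
# not LABO's or torch-brain's actual data format.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

import numpy as np
import torch


@dataclass
class XRTrial:
    eeg: np.ndarray          # shape (n_channels, n_samples)
    eeg_rate_hz: float
    gaze: np.ndarray         # shape (n_frames, 2), normalized (x, y) gaze position
    hand_pose: np.ndarray    # shape (n_frames, n_joints, 3), 3D joint positions
    events: List[Tuple[float, str]] = field(default_factory=list)  # (time_s, label)
    metadata: Dict[str, str] = field(default_factory=dict)         # device, session, context

    def to_tensors(self) -> Dict[str, torch.Tensor]:
        # Convert the aligned streams to tensors for a PyTorch-style training pipeline.
        return {
            "eeg": torch.from_numpy(self.eeg).float(),
            "gaze": torch.from_numpy(self.gaze).float(),
            "hand_pose": torch.from_numpy(self.hand_pose).float(),
        }


# Example: one second of synthetic data standing in for a real recording.
trial = XRTrial(
    eeg=np.random.randn(8, 300),
    eeg_rate_hz=300.0,
    gaze=np.random.rand(90, 2),
    hand_pose=np.random.randn(90, 26, 3),
    events=[(0.25, "stimulus_onset"), (0.80, "eeg_triggered_action")],
    metadata={"headset": "XR HMD", "eeg_device": "DSI-VR300"},
)
batch = trial.to_tensors()
print({name: tuple(t.shape) for name, t in batch.items()})

Self-describing records along these lines, with explicit sampling rates, event timestamps, and device metadata, are what would allow recordings from different headsets and biosensors to be pooled into shared multimodal training datasets.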
Websites:
https://silicolabs.ca/
https://wearablesensing.com/