

Poster

The ToMCAT Dataset

Adarsh Pyarelal · Eric Duong · Caleb Shibu · Paulo Soares · Savannah Boyd · Payal Khosla · Valeria A. Pfeifer · Diheng Zhang · Eric Andrews · Rick Champlin · Vincent Raymond · Meghavarshini Krishnaswamy · Clayton Morrison · Emily Butler · Kobus Barnard

Great Hall & Hall B1+B2 (level 1) #529

Abstract:

We present a rich, multimodal dataset consisting of data from 40 teams of three humans conducting simulated urban search-and-rescue (SAR) missions in a Minecraft-based testbed, collected for the Theory of Mind-based Cognitive Architecture for Teams (ToMCAT) project. Modalities include two kinds of brain scan data, functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG), as well as skin conductance, heart rate, eye tracking, face images, spoken dialog audio with automatic speech recognition (ASR) transcriptions, game screenshots, gameplay data, game performance data, demographic data, and self-report questionnaires. Each team undergoes up to six consecutive phases: three behavioral tasks, one mission training session, and two collaborative SAR missions. Because the data are time-synchronized and were collected under a variety of circumstances, this dataset will support a wide variety of research questions on topics including teamwork, coordination, plan recognition, affective computing, physiological linkage, entrainment, and dialog understanding. We provide an initial public release of the de-identified data, along with analyses illustrating the utility of this dataset to both computer scientists and social scientists.
