Multimodal Human Activity Recognition At Home
Riku Arakawa · Prasoon Patidar · Yuvraj Agarwal · Mayank Goel
Abstract
We propose to develop [anonymous dataset], a large-scale, multimodal human activity recognition (HAR) dataset collected in real-world kitchen environments. The dataset captures diverse sensor streams, including ambient video, thermal imaging, LiDAR, radar, and smartwatch data, enabling daily activity prediction and the modeling of procedural behaviors. Unlike prior datasets constrained to lab settings or unimodal sensing, our dataset supports research in long-term user behavior modeling, cross-modal transfer, and privacy-preserving assistive AI applications. It is designed to support predictive, generative, and language-based HAR tasks, pushing the boundary of context-aware AI in naturalistic home settings.