Automatically recognizing a broad spectrum of human activities is key to realizing many compelling applications in health, personal assistance, human-computer interaction, and smart environments. In real-world settings, however, approaches to human activity recognition have been largely constrained to detecting mobility states, e.g., walking, running, standing. In this work, we explore the use of inertial-acoustic sensing provided by commodity smartwatches for detecting activities of daily living (ADLs). We conduct a semi-naturalistic study with a diverse set of 15 participants in their own homes and show that acoustic and inertial sensor data can be combined to recognize 23 activities…
Recent work in Automated Dietary Monitoring (ADM) has shown promising results in eating detection by tracking jawbone movements with a proximity sensor mounted on a necklace. A significant challenge with this approach, however, is that motion artifacts introduced by natural body movements cause the necklace to move freely and the sensor to become misaligned. In this paper, we propose a different but related approach: we develop a small wireless inertial sensing platform and perform eating detection by mounting the sensor directly on the underside of the jawbone…
Over the last decade, advances in mobile technologies have enabled the development of intelligent systems that attempt to recognize and model a variety of health-related human behaviors. While automated dietary monitoring based on passive sensors has been an area of increasing research activity for many years, much less attention has been given to tracking fluid intake. In this work, we apply an adaptive segmentation technique to a continuous stream of inertial data captured with a practical, off-the-shelf wrist-mounted device to detect fluid intake gestures passively…
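As a rough illustration of the kind of adaptive segmentation mentioned above (not the paper's actual method), a stream of inertial data can be split into candidate gesture segments by flagging regions whose short-time energy rises above a threshold derived from the stream's own statistics. The window size, multiplier `k`, and minimum segment length below are illustrative placeholders, not values from the study:

```python
import numpy as np

def adaptive_segments(signal, win=25, k=2.0, min_len=10):
    """Sketch of adaptive segmentation on a 1-D inertial stream.

    Marks samples whose short-time energy exceeds mean + k * std of
    the stream's energy, then merges consecutive marked samples into
    (start, end) segments. All parameters here are assumptions.
    """
    # Short-time energy over a sliding window
    energy = np.convolve(signal ** 2, np.ones(win) / win, mode="same")
    # Threshold adapts to the statistics of this particular stream
    thresh = energy.mean() + k * energy.std()
    active = energy > thresh

    segments, start = [], None
    for i, is_active in enumerate(active):
        if is_active and start is None:
            start = i
        elif not is_active and start is not None:
            if i - start >= min_len:  # drop spurious blips
                segments.append((start, i))
            start = None
    if start is not None and len(active) - start >= min_len:
        segments.append((start, len(active)))
    return segments

# Toy usage: low-amplitude noise with one high-energy burst
rng = np.random.default_rng(0)
sig = 0.05 * rng.standard_normal(500)
sig[200:260] += 2.0 * np.sin(np.linspace(0, 20, 60))
print(adaptive_segments(sig))  # one segment covering the burst
```

In a real deployment the threshold would typically be updated online (e.g., from a trailing window) rather than computed over the whole recording, since a wearable sees the stream incrementally.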