Abstract
In healthcare research, eating activity detection and recognition have been studied for many years. Most previous approaches rely on body-worn sensors to detect eating behavior. However, measurement errors from these sensors can substantially reduce the accuracy of velocity and acceleration estimates. To avoid this problem, we use a Microsoft Kinect to capture the skeleton motions of eating and drinking behaviors. In this paper we introduce a moving average method that removes noise from the relative distances of the captured joint positions, so that eating activity can be segmented into feeding and non-feeding frames. To identify different eating patterns, eating and drinking behavior recognition is then performed on features extracted from the resulting feeding periods. The experiments are evaluated on our collected eating and drinking action dataset, and we also vary the distance between the Kinect and the subject to test the robustness of our approach. The results show better detection and recognition performance than other approaches. This pioneering work on eating behavior analysis can lead to many potential applications, such as a Web system that helps people share and search their eating and drinking actions, as well as intelligent analysis that provides suggestions.
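The core idea in the abstract — smoothing a noisy joint-distance signal with a moving average and then thresholding it into feeding and non-feeding frames — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the window size, the distance threshold, and the example hand-to-head distances are all assumed values for demonstration.

```python
# Hypothetical sketch of moving-average denoising followed by frame
# segmentation. Window size and threshold are illustrative assumptions,
# not values taken from the paper.

def moving_average(signal, window=5):
    """Centered moving average; the window shrinks near the edges."""
    half = window // 2
    smoothed = []
    for i in range(len(signal)):
        lo = max(0, i - half)
        hi = min(len(signal), i + half + 1)
        smoothed.append(sum(signal[lo:hi]) / (hi - lo))
    return smoothed

def segment_feeding(distances, window=5, threshold=0.33):
    """Label each frame: True = feeding (small hand-to-head distance),
    False = non-feeding."""
    smoothed = moving_average(distances, window)
    return [d < threshold for d in smoothed]

# Example: assumed wrist-to-head distances (metres) over 10 Kinect frames,
# with a dip in the middle where the hand approaches the mouth.
frames = [0.60, 0.55, 0.15, 0.12, 0.14, 0.13, 0.58, 0.62, 0.61, 0.59]
labels = segment_feeding(frames)
```

Smoothing before thresholding prevents a single noisy frame from splitting one feeding gesture into several spurious segments, which is why the denoising step precedes segmentation in the pipeline described above.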
Original language | English |
---|---|
Pages (from-to) | 1343-1358 |
Number of pages | 16 |
Journal | World Wide Web |
Volume | 22 |
Issue number | 3 |
DOIs | |
Publication status | Published - 15 May 2019 |
Bibliographical note
Publisher Copyright: © 2018, Springer Science+Business Media, LLC, part of Springer Nature.
Keywords
- Eating action recognition
- Microsoft Kinect
- Self-feeding detection
- Skeleton motions