Energy expenditure estimation using visual and inertial sensors

Research output: Contribution to journal › Article

Standard

Energy expenditure estimation using visual and inertial sensors. / Tao, Lili; Burghardt, Tilo; Mirmehdi, Majid; Damen, Dima; Cooper, Ashley; Camplani, Massimo; Hannuna, Sion; Paiement, Adeline; Craddock, Ian.

In: IET Computer Vision, Vol. 12, No. 1, 01.02.2018, p. 36-47.

Author

Tao, Lili ; Burghardt, Tilo ; Mirmehdi, Majid ; Damen, Dima ; Cooper, Ashley ; Camplani, Massimo ; Hannuna, Sion ; Paiement, Adeline ; Craddock, Ian. / Energy expenditure estimation using visual and inertial sensors. In: IET Computer Vision. 2018 ; Vol. 12, No. 1. pp. 36-47.

BibTeX

@article{50443c6352334a60ac6035fb7a8d9de6,
title = "Energy expenditure estimation using visual and inertial sensors",
abstract = "Deriving a person's energy expenditure accurately forms the foundation for tracking physical activity levels across many health and lifestyle monitoring tasks. In this study, the authors present a method for estimating calorific expenditure from combined visual and accelerometer sensors by way of an RGB-Depth camera and a wearable inertial sensor. The proposed individual-independent framework fuses information from both modalities which leads to improved estimates beyond the accuracy of single modality and manual metabolic equivalents of task (MET) lookup table based methods. For evaluation, the authors introduce a new dataset called SPHERE_RGBD + Inertial_calorie, for which visual and inertial data are simultaneously obtained with indirect calorimetry ground truth measurements based on gas exchange. Experiments show that the fusion of visual and inertial data reduces the estimation error by 8 and 18{\%} compared with the use of visual only and inertial sensor only, respectively, and by 33{\%} compared with a MET-based approach. The authors conclude from their results that the proposed approach is suitable for home monitoring in a controlled environment.",
keywords = "Digital Health",
author = "Lili Tao and Tilo Burghardt and Majid Mirmehdi and Dima Damen and Ashley Cooper and Massimo Camplani and Sion Hannuna and Adeline Paiement and Ian Craddock",
year = "2018",
month = "2",
day = "1",
doi = "10.1049/iet-cvi.2017.0112",
language = "English",
volume = "12",
pages = "36--47",
journal = "IET Computer VIsion",
issn = "1751-9632",
publisher = "Institution of Engineering and Technology (IET)",
number = "1",

}
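
For readers unfamiliar with the MET baseline that the abstract compares against: the standard lookup-table approach estimates energy expenditure as kcal = MET × body mass (kg) × duration (h), with per-activity MET values read from a published table. The Python sketch below is a minimal illustration of that baseline; the activity names and MET values are approximations in the spirit of the public Compendium of Physical Activities, not figures taken from the paper.

# Hedged sketch of the MET lookup-table baseline described in the abstract.
# Standard relation: kcal = MET * body_mass_kg * duration_hours.
# The activity-to-MET mapping below is illustrative, not the paper's table.

MET_TABLE = {
    "sitting": 1.3,    # quiet sitting
    "standing": 1.8,   # standing, light effort
    "walking": 3.5,    # walking at a moderate pace
}

def met_energy_kcal(activity: str, body_mass_kg: float, duration_h: float) -> float:
    """Estimate calorific expenditure for one activity bout via MET lookup."""
    return MET_TABLE[activity] * body_mass_kg * duration_h

# Example: a 70 kg person walking for 30 minutes -> 3.5 * 70 * 0.5 = 122.5 kcal
print(met_energy_kcal("walking", 70.0, 0.5))

Per the abstract, the proposed sensor-fusion approach reduces estimation error by 33% relative to this kind of MET-based estimate.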

RIS - suitable for import to EndNote

TY - JOUR

T1 - Energy expenditure estimation using visual and inertial sensors

AU - Tao, Lili

AU - Burghardt, Tilo

AU - Mirmehdi, Majid

AU - Damen, Dima

AU - Cooper, Ashley

AU - Camplani, Massimo

AU - Hannuna, Sion

AU - Paiement, Adeline

AU - Craddock, Ian

PY - 2018/2/1

Y1 - 2018/2/1

N2 - Deriving a person's energy expenditure accurately forms the foundation for tracking physical activity levels across many health and lifestyle monitoring tasks. In this study, the authors present a method for estimating calorific expenditure from combined visual and accelerometer sensors by way of an RGB-Depth camera and a wearable inertial sensor. The proposed individual-independent framework fuses information from both modalities which leads to improved estimates beyond the accuracy of single modality and manual metabolic equivalents of task (MET) lookup table based methods. For evaluation, the authors introduce a new dataset called SPHERE_RGBD + Inertial_calorie, for which visual and inertial data are simultaneously obtained with indirect calorimetry ground truth measurements based on gas exchange. Experiments show that the fusion of visual and inertial data reduces the estimation error by 8 and 18% compared with the use of visual only and inertial sensor only, respectively, and by 33% compared with a MET-based approach. The authors conclude from their results that the proposed approach is suitable for home monitoring in a controlled environment.

AB - Deriving a person's energy expenditure accurately forms the foundation for tracking physical activity levels across many health and lifestyle monitoring tasks. In this study, the authors present a method for estimating calorific expenditure from combined visual and accelerometer sensors by way of an RGB-Depth camera and a wearable inertial sensor. The proposed individual-independent framework fuses information from both modalities which leads to improved estimates beyond the accuracy of single modality and manual metabolic equivalents of task (MET) lookup table based methods. For evaluation, the authors introduce a new dataset called SPHERE_RGBD + Inertial_calorie, for which visual and inertial data are simultaneously obtained with indirect calorimetry ground truth measurements based on gas exchange. Experiments show that the fusion of visual and inertial data reduces the estimation error by 8 and 18% compared with the use of visual only and inertial sensor only, respectively, and by 33% compared with a MET-based approach. The authors conclude from their results that the proposed approach is suitable for home monitoring in a controlled environment.

KW - Digital Health

UR - http://www.scopus.com/inward/record.url?scp=85041116428&partnerID=8YFLogxK

U2 - 10.1049/iet-cvi.2017.0112

DO - 10.1049/iet-cvi.2017.0112

M3 - Article

VL - 12

SP - 36

EP - 47

JO - IET Computer Vision

JF - IET Computer Vision

SN - 1751-9632

IS - 1

ER -
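
The abstract states only that the framework fuses information from the visual and inertial modalities; it does not specify the fusion mechanism. The sketch below illustrates one generic pattern consistent with that description, feature-level fusion into a learned regressor. It is not the authors' pipeline: every helper, feature choice, and the regressor itself are hypothetical stand-ins, and the data are synthetic.

# Illustrative sketch of feature-level sensor fusion for calorie regression.
# NOT the paper's method; just one common fusion pattern. All names
# (extract_* helpers, window shapes, the regressor) are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def extract_visual_features(depth_window: np.ndarray) -> np.ndarray:
    """Toy visual descriptor: summary statistics over a depth-frame window."""
    return np.array([depth_window.mean(), depth_window.std()])

def extract_inertial_features(accel_window: np.ndarray) -> np.ndarray:
    """Toy inertial descriptor: mean and variance of acceleration magnitude."""
    mag = np.linalg.norm(accel_window, axis=1)
    return np.array([mag.mean(), mag.var()])

def fuse(depth_window: np.ndarray, accel_window: np.ndarray) -> np.ndarray:
    # Feature-level fusion: concatenate per-modality descriptors.
    return np.concatenate([extract_visual_features(depth_window),
                           extract_inertial_features(accel_window)])

# Training on synthetic stand-in data; in the paper, targets come from
# indirect calorimetry (gas-exchange) ground-truth measurements.
rng = np.random.default_rng(0)
X = np.stack([fuse(rng.random((30, 64, 48)), rng.random((30, 3)))
              for _ in range(100)])
y = rng.random(100) * 5.0          # stand-in calorie-rate targets
model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
print(model.predict(X[:3]))        # fused-feature calorie-rate estimates

In the paper itself, fusing the two modalities in this general spirit reportedly reduced estimation error by 8% and 18% relative to the visual-only and inertial-only variants, respectively.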