Sit-to-Stand Analysis in the Wild using Silhouettes for Longitudinal Health Monitoring

Research output: Contribution to conference › Conference Paper › peer-review

4 Citations (Scopus)

Abstract

We present the first fully automated Sit-to-Stand or Stand-to-Sit (StS) analysis framework for long-term monitoring of patients in free-living environments using video silhouettes. Our method adopts a coarse-to-fine time localisation approach, in which a deep learning classifier identifies possible StS sequences from silhouettes and a smart peak detection stage provides fine localisation based on 3D bounding boxes. We tested our method on data from the real homes of participants and monitored patients undergoing total hip or knee replacement. Our results show 94.4% overall accuracy in the coarse localisation and an error of 0.026 m/s in the speed-of-ascent measurement, highlighting important trends in the recuperation of patients who underwent surgery.
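As a rough illustration of the fine-localisation idea described above, the sketch below estimates speed of ascent from a per-frame 3D bounding-box height signal using simple smoothing and peak detection. This is a minimal sketch only: the function name, smoothing parameters, and velocity thresholds are illustrative assumptions and are not taken from the paper's implementation.

```python
# Illustrative sketch (not the paper's method): given per-frame heights of a
# person's 3D bounding box (in metres) and the frame rate, locate a rising
# (sit-to-stand) phase and estimate speed of ascent in m/s.
import numpy as np
from scipy.signal import find_peaks, savgol_filter


def speed_of_ascent(box_heights_m, fps, min_rise_m=0.2):
    """Estimate ascent speed (m/s) from a bounding-box height time series."""
    # Smooth the noisy height signal before differentiating.
    heights = savgol_filter(np.asarray(box_heights_m, dtype=float),
                            window_length=11, polyorder=2)
    velocity = np.gradient(heights) * fps              # vertical speed (m/s)

    # Candidate rising phases appear as positive velocity peaks.
    peaks, _ = find_peaks(velocity, height=0.05)
    if len(peaks) == 0:
        return None
    peak = peaks[np.argmax(velocity[peaks])]           # strongest rise

    # Walk outwards from the peak to where the upward motion starts and ends.
    start = peak
    while start > 0 and velocity[start - 1] > 0.01:
        start -= 1
    end = peak
    while end < len(velocity) - 1 and velocity[end + 1] > 0.01:
        end += 1

    rise = heights[end] - heights[start]
    if rise < min_rise_m:
        return None                                    # too small to be a real StS
    return rise / ((end - start) / fps)


# Example usage: heights sampled at 25 fps while someone rises from a chair.
# speed = speed_of_ascent(heights, fps=25)
```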
Original language: English
DOIs
Publication status: Published - 3 Aug 2019
Event: 16th International Conference on Image Analysis and Recognition - Waterloo, Canada
Duration: 27 Aug 2019 - 29 Aug 2019
Conference number: 16
http://www.aimiconf.org/iciar19/

Conference

Conference: 16th International Conference on Image Analysis and Recognition
Abbreviated title: ICIAR
Period: 27/08/19 - 29/08/19
Internet address: http://www.aimiconf.org/iciar19/

Research Groups and Themes

  • Digital Health
  • SPHERE
