Who Goes There? Exploiting Silhouettes and Wearable Signals for Subject Identification in Multi-Person Environments

Research output: Contribution to conference › Conference Paper › peer-review

Abstract

The re-identification of people in private environments is a challenging task, not only from a technical standpoint but also because of the ethical issues it raises. The lack of a privacy-sensitive technology to monitor specific individuals hinders the uptake of assistive systems, for example in Ambient Assisted Living and health monitoring applications. Our approach adopts a multimodal deep learning framework that matches silhouette video clips with accelerometer signals to identify and re-identify subjects of interest within a multi-person environment. Brief sequences, as short as 3 seconds, are encoded into a latent space in which a simple Euclidean distance discriminates matching from non-matching pairs. Identities are revealed only in terms of accelerometer carriers, and the use of silhouettes instead of RGB video helps to ring-fence privacy concerns. We train our method on the SPHERE Calorie Dataset, on which we achieve an average area under the ROC curve of 76.3%. We also propose a novel triplet loss that we show improves both performance and convergence speed.
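To make the matching scheme described in the abstract more concrete, the sketch below illustrates the general idea in PyTorch: two modality-specific encoders map silhouette and accelerometer features into a shared embedding space, training uses a margin-based triplet loss, and test-time identification picks the accelerometer carrier whose embedding is closest in Euclidean distance. This is a minimal illustration, not the paper's architecture or its novel triplet loss; the class name `TripletMatcher`, the feature dimensions, the encoder layers, and the standard margin loss are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TripletMatcher(nn.Module):
    """Toy two-branch encoder: one branch for silhouette-clip features,
    one for accelerometer windows, both mapped into a shared embedding space.
    (Hypothetical stand-in for the paper's encoders, which are not shown here.)"""

    def __init__(self, silhouette_dim=1024, accel_dim=128, embed_dim=64):
        super().__init__()
        self.video_encoder = nn.Sequential(
            nn.Linear(silhouette_dim, 256), nn.ReLU(), nn.Linear(256, embed_dim)
        )
        self.accel_encoder = nn.Sequential(
            nn.Linear(accel_dim, 256), nn.ReLU(), nn.Linear(256, embed_dim)
        )

    def forward(self, silhouette_feat, accel_feat):
        # L2-normalise so Euclidean distances are comparable across modalities.
        v = F.normalize(self.video_encoder(silhouette_feat), dim=-1)
        a = F.normalize(self.accel_encoder(accel_feat), dim=-1)
        return v, a


def standard_triplet_loss(anchor, positive, negative, margin=0.2):
    """Conventional margin-based triplet loss on squared Euclidean distances.
    The paper proposes a novel variant; this is only the common baseline form."""
    d_pos = (anchor - positive).pow(2).sum(dim=-1)
    d_neg = (anchor - negative).pow(2).sum(dim=-1)
    return F.relu(d_pos - d_neg + margin).mean()


def match_identities(video_embeddings, accel_embeddings):
    """Assign each silhouette embedding to the nearest accelerometer embedding.

    video_embeddings: (num_silhouettes, embed_dim)
    accel_embeddings: (num_wearables, embed_dim)
    """
    dists = torch.cdist(video_embeddings, accel_embeddings)  # pairwise Euclidean
    return dists.argmin(dim=1)  # index of the best-matching accelerometer carrier
```

L2-normalising the embeddings keeps distances bounded, which is a common design choice when a single Euclidean threshold is used to decide whether a silhouette clip and an accelerometer window belong to the same person.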
Original language: English
Publication status: Accepted/In press – 20 Aug 2019
Event: International Conference on Computer Vision 2019: Workshop on Computer Vision for Physiological Measurement - Seoul, Korea, Republic of
Duration: 27 Oct 2019 – 2 Nov 2019

Workshop

Workshop: International Conference on Computer Vision 2019
Abbreviated title: ICCV
Country/Territory: Korea, Republic of
City: Seoul
Period: 27/10/19 – 2/11/19

Structured keywords

  • Digital Health
  • SPHERE
