<p>Camera-based remote photoplethysmography (rPPG) technology has shown a promising future in contact-free cardiac and other smart health applications. The rPPG technology typically requires facial videos as a source input, which may lead to identity-privacy concerns. Facial videos are sensitive and contain subjects' identifiable appearance features. Coupled with the health information potentially revealed by rPPG techniques, the compounding sensitivity has been a major obstacle to encouraging the sharing of facial rPPG video datasets in the research community to foster the advancement of the field. This paper investigates a suite of anonymization transforms that remove the identifiable appearance features in facial videos and retain the physiological signals for rPPG analysis. After the transformation, the facial videos are de-identified and may be shared in public with little risk of identity-privacy leakage. The proposed algorithm offers tunable options to balance the physiological fidelity and the identity-protecting strength to meet different levels of privacy requirements. A human subject study has been carried out to understand -- both qualitatively and quantitatively -- the perceived strength and efficacy of privacy protection by these anonymization techniques in de-identifying the facial videos and maintaining the physiological signals.</p>