Remote measurement of vital signs (including, but not limited to, heart rate, heart rate variability, and respiratory rate) has lately gained considerable attention. Such methods mostly rely on remote estimation of photoplethysmographic information, which can be extracted from the subtle skin color changes caused by periodic blood volume and oxygen saturation variations in the vessels. This information has potential uses in physical and psychological diagnosis.
Experimental studies have shown that such signals can be extracted under controlled environments. In an attempt to bring this technology into real life, we worked toward a real-time, head-movement-resilient method that can be applied in practical scenarios. Most prior frequency-based methods work reliably when the signal-to-noise ratio is high; however, their estimates deviate strongly from the true values whenever conditions deteriorate even slightly.
The facial appearance model optimized in FaceReader provided us with a strong starting point. We use its facial landmarks to obtain head-pose-independent regions of interest, which gives us stable signal acquisition. We also introduce a time-domain periodic component estimation based on novel signal processing methods. To evaluate the performance of the proposed method, we collected a ground-truth dataset.
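The landmark-based ROI step can be sketched as follows: average the green channel inside a polygon spanned by a few landmark points, yielding one rPPG sample per frame. The specific landmark coordinates and the pure-NumPy ray-casting mask below are illustrative assumptions, not the FaceReader implementation.

```python
import numpy as np

def polygon_mask(points, shape):
    """Boolean mask of pixels inside a polygon (simple ray casting, pure NumPy)."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    inside = np.zeros(shape, dtype=bool)
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]
        # Toggle pixels whose horizontal ray crosses this edge
        crosses = ((ys >= min(y0, y1)) & (ys < max(y0, y1)) &
                   (xs < x0 + (ys - y0) * (x1 - x0) / (y1 - y0 + 1e-12)))
        inside ^= crosses
    return inside

def roi_mean_green(frame, landmarks):
    """Average green-channel value inside the landmark-defined ROI (one rPPG sample)."""
    mask = polygon_mask(landmarks, frame.shape[:2])
    return frame[..., 1][mask].mean()

# Hypothetical cheek ROI on a synthetic 100x100 RGB frame
frame = np.full((100, 100, 3), 120, dtype=np.uint8)
cheek = [(30, 40), (60, 40), (60, 70), (30, 70)]  # assumed (x, y) landmark corners
print(roi_mean_green(frame, cheek))  # -> 120.0
```

Because the polygon follows the landmarks, the same skin patch is sampled even as the head moves, which is what makes the acquired signal stable.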
The figures below show the processing steps: the raw trace is denoised and detrended before frequency spectrum analysis.
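A minimal sketch of such a pipeline, assuming a 30 fps camera and a standard detrend / band-pass / FFT-peak chain (the filter band and orders are illustrative choices, not the paper's exact parameters):

```python
import numpy as np
from scipy.signal import butter, filtfilt, detrend

def estimate_heart_rate(signal, fs, low=0.7, high=4.0):
    """Detrend, band-pass filter, and return the dominant frequency in bpm."""
    x = detrend(signal)                                   # remove slow baseline drift
    b, a = butter(3, [low / (fs / 2), high / (fs / 2)], btype="band")
    x = filtfilt(b, a, x)                                 # zero-phase band-pass, 0.7-4 Hz (42-240 bpm)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    band = (freqs >= low) & (freqs <= high)
    peak = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak                                    # Hz -> beats per minute

# Synthetic green-channel trace: 72 bpm pulse + linear trend + noise
np.random.seed(0)
fs = 30.0
t = np.arange(0, 20, 1.0 / fs)
trace = 0.02 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * t + 0.01 * np.random.randn(len(t))
print(round(estimate_heart_rate(trace, fs)))  # -> 72
```

With a 20 s window the FFT bin width is 0.05 Hz, i.e. a 3 bpm resolution; longer windows trade latency for precision.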
One example of how the method performs in a real-time scenario: the subject is observed after an active exercise session (running for 5 minutes). The heart rate drops from 130 to 60 bpm in roughly one and a half minutes. The red curve, obtained with our method, closely follows the ground-truth signal recorded with a contact sensor.
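Tracking such a recovery requires re-estimating the rate over short sliding windows rather than one long recording. A sketch under assumed parameters (10 s windows, 1 s hops, a synthetic pulse slowing from ~130 to ~60 bpm; not the paper's actual tracker):

```python
import numpy as np

def sliding_hr(signal, fs, win_s=10.0, step_s=1.0, low=0.7, high=4.0):
    """Heart-rate trace (bpm) from FFT peaks over sliding windows."""
    win, step = int(win_s * fs), int(step_s * fs)
    rates = []
    for start in range(0, len(signal) - win + 1, step):
        seg = signal[start:start + win]
        seg = seg - seg.mean()                        # remove DC before the FFT
        freqs = np.fft.rfftfreq(win, d=1.0 / fs)
        spec = np.abs(np.fft.rfft(seg))
        band = (freqs >= low) & (freqs <= high)
        rates.append(60.0 * freqs[band][np.argmax(spec[band])])
    return rates

# Hypothetical recovery: pulse frequency falls linearly over 90 s
np.random.seed(0)
fs = 30.0
t = np.arange(0, 90, 1.0 / fs)
f_inst = 130 / 60 + (60 / 60 - 130 / 60) * t / 90     # instantaneous frequency (Hz)
phase = 2 * np.pi * np.cumsum(f_inst) / fs            # integrate frequency to phase
trace = np.sin(phase) + 0.05 * np.random.randn(len(t))
hr = sliding_hr(trace, fs)
print(hr[0], hr[-1])  # starts near 130 bpm, ends near 60 bpm
```

Each window reports the average rate over its span, so shorter windows react faster to the drop at the cost of coarser frequency resolution.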
We believe that by utilizing multimodal information, more insight into people can be gained: how do they really feel in certain situations, and does their visual appearance correlate with their psychological and physical condition? Facial expressions can be merged with these physiological measures to capture more information.
H. Emrah Tasli, Amogh Gudi, Marten den Uyl, "Integrating Remote PPG in Facial Expression Analysis Framework," Demo paper, International Conference on Multimodal Interaction (ICMI), 2014.
H. Emrah Tasli, Amogh Gudi, Marten den Uyl, "Remote PPG based Vital Sign Measurement using Adaptive Regions," International Conference on Image Processing (ICIP), 2014.