A team of scientists led by the University of Washington has devised a new method to track health using device cameras. Using just the camera on a smartphone or computer, a person can measure their pulse and respiration rate from a real-time video of their face.
The system uses machine learning to capture subtle changes in how light reflects off a person’s face, changes that are correlated with blood flow beneath the skin. These changes are then converted into both pulse and respiration rates.
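To make the underlying idea concrete, here is a minimal, classical sketch of camera-based pulse estimation, not the UW team’s learned model: average the green channel over a face region in each frame, then read the pulse rate off the dominant frequency of that signal. The fixed face box and 30 fps frame rate are illustrative assumptions.

```python
import numpy as np

def estimate_pulse_bpm(frames, fps=30.0, face_box=(100, 100, 200, 200)):
    """frames: iterable of HxWx3 uint8 RGB frames; returns an estimated pulse in BPM."""
    x, y, w, h = face_box
    # Mean green-channel intensity per frame -- a crude proxy for blood volume changes.
    signal = np.array([f[y:y+h, x:x+w, 1].mean() for f in frames])
    signal = signal - signal.mean()           # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))    # magnitude spectrum of the intensity trace
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    # Keep only physiologically plausible heart rates (0.7-4 Hz = 42-240 BPM).
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                        # Hz -> beats per minute
```

This kind of fixed-region, single-channel analysis is exactly what breaks down across skin tones, lighting, and movement, which is the gap the learned system described in the article is meant to close.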
Lead author Xin Liu, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering, said, “Machine learning is pretty good at classifying images. If you give it a series of photos of cats and then tell it to find cats in other images, it can do it. But for machine learning to be helpful in remote health sensing, we need a system that can identify the region of interest in a video that holds the strongest source of physiological information — pulse, for example — and then measure that over time.”
“Every person is different. So this system needs to be able to adapt to each person’s unique physiological signature quickly, and separate this from other variations, such as what they look like and what environment they are in.”
In the initial phase, the scientists trained the system on a dataset containing both videos of people’s faces and “ground truth” information: each person’s pulse and respiration rate measured by standard instruments in the field. The system then uses both spatial and temporal information from the videos to determine the vital signs.
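As a rough illustration of this supervised setup, the sketch below pairs video clips with ground-truth pulse waveforms from a contact sensor and fits a small spatiotemporal network. `PulseNet` is a hypothetical toy model for illustration, not the authors’ published architecture.

```python
import torch
import torch.nn as nn

class PulseNet(nn.Module):
    """Toy 3D-convolutional regressor: video clip -> per-frame pulse waveform."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d((None, 1, 1)),   # collapse spatial dims, keep time
        )
        self.head = nn.Conv1d(16, 1, kernel_size=1)

    def forward(self, clip):                      # clip: (B, 3, T, H, W)
        feats = self.features(clip)               # (B, 16, T, 1, 1)
        feats = feats.squeeze(-1).squeeze(-1)     # (B, 16, T)
        return self.head(feats).squeeze(1)        # (B, T) predicted waveform

model = PulseNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(clip, ground_truth_pulse):
    """clip: (B, 3, T, H, W) face video; ground_truth_pulse: (B, T) from a contact sensor."""
    optimizer.zero_grad()
    loss = loss_fn(model(clip), ground_truth_pulse)
    loss.backward()
    optimizer.step()
    return loss.item()
```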
Specifically, the system looks for the regions in a video frame most likely to contain physiological features correlated with changing blood flow in a face, across different contexts such as skin tones, lighting conditions, and environments. It then focuses on those regions to measure the pulse and respiration rate.
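The region-of-interest idea can be illustrated with a generic soft spatial attention layer that weights the locations of a feature map before pooling; this is a common attention pattern, not necessarily the exact mechanism used in the paper.

```python
import torch
import torch.nn as nn

class SpatialAttentionPool(nn.Module):
    """Scores each spatial location of a feature map, then pools over space."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)    # per-pixel relevance score

    def forward(self, feature_map):                           # (B, C, H, W)
        b, c, h, w = feature_map.shape
        scores = self.score(feature_map).view(b, 1, h * w)
        weights = torch.softmax(scores, dim=-1).view(b, 1, h, w)  # sums to 1 over space
        return (feature_map * weights).sum(dim=(2, 3))        # (B, C) attended features
```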
Liu said, “We acknowledge that there is still a trend toward inferior performance when the subject’s skin type is darker. This is partly because light reflects differently off of darker skin, resulting in a weaker signal for the camera to pick up. Our team is actively developing new methods to solve this limitation.”
Shwetak Patel, a professor in both the Allen School and the electrical and computer engineering department, said, “Any ability to sense pulse or respiration rate remotely provides new opportunities for remote patient care and telemedicine. This could include self-care, follow-up care, or triage, especially when someone doesn’t have convenient access to a clinic. It’s exciting to see academic communities working on new algorithmic approaches to address this with devices that people have in their homes.”
Journal Reference:
- Xin Liu, Ziheng Jiang, Josh Fromm, Xuhai Xu, Shwetak Patel, Daniel McDuff. MetaPhys: few-shot adaptation for non-contact physiological measurement. CHIL ’21: Proceedings of the Conference on Health, Inference, and Learning, April 2021, pages 154–163. DOI: 10.1145/3450439.3451870
"Smartphone" - Google News
April 03, 2021 at 06:55PM
https://ift.tt/2Po1V5r
Measuring pulse with smartphone - Tech Explorist
"Smartphone" - Google News
https://ift.tt/2QXWyGT
https://ift.tt/2KSW0PQ
Bagikan Berita Ini
0 Response to "Measuring pulse with smartphone - Tech Explorist"
Post a Comment