Phone cameras can now measure your pulse and breathing

The technique could expand the capabilities of telemedicine.

Telemedicine has taken off since the start of the pandemic, as healthcare delivered through a camera carries no COVID-19 risk.

There are limits to what doctors and nurses can learn remotely, however: some diagnostic tests and measurements require physical contact, and subtle details of an examination can be lost over the internet.

Researchers at the University of Washington have devised a way to measure patients’ respiration rates and pulse via a computer or smartphone camera, in an attempt to make telemedicine more accurate and useful.

According to UW News, the system, called MetaPhys, uses a machine learning algorithm to identify and track subtle changes in how light reflects off a patient’s face. Those changes correlate with variations in blood flow, from which pulse and respiration rate can be deduced.
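At its core, the signal is a tiny, periodic color change in skin pixels as blood volume rises and falls. As a rough illustration only (this is a classical hand-crafted baseline, not the MetaPhys model, which learns the mapping with a neural network; the function name and frequency band below are illustrative assumptions), pulse can be recovered by averaging skin-pixel color over time and finding the dominant frequency:

```python
# Illustrative only: a classical remote-photoplethysmography baseline,
# not the MetaPhys model. MetaPhys learns this mapping with a neural
# network; this sketch just shows the underlying idea.
import numpy as np

def estimate_pulse_bpm(frames: np.ndarray, fps: float) -> float:
    """Estimate heart rate from a face-cropped RGB video.

    frames: array of shape (T, H, W, 3), assumed to contain mostly skin.
    fps: camera frame rate in frames per second.
    """
    # Average the green channel over each frame; blood volume changes
    # modulate how much green light the skin reflects.
    signal = frames[:, :, :, 1].mean(axis=(1, 2))

    # Zero-center and normalize so the DC component and overall
    # brightness don't dominate the spectrum.
    signal = (signal - signal.mean()) / (signal.std() + 1e-8)

    # Find the dominant frequency within a plausible heart-rate band.
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)  # 42 to 240 beats per minute
    peak_freq = freqs[band][np.argmax(power[band])]
    return peak_freq * 60.0  # Hz to beats per minute
```

Hand-crafted pipelines like this break down under real-world variation, which is exactly the problem the UW team’s learned approach targets.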

“Machine learning is pretty good at classifying images. If you give it a series of photos of cats and then tell it to find cats in other images, it can do it,” Xin Liu, the study’s lead author and a Paul G. Allen School of Computer Science & Engineering doctoral student, told UW News. 

“But for machine learning to be helpful in remote health sensing, we need a system that can identify the region of interest in a video that holds the strongest source of physiological information — pulse, for example — and then measure that over time.”

The team’s original iteration of the system was presented this past December at the Neural Information Processing Systems conference; this second version, which the researchers presented on April 8 at the Association for Computing Machinery’s Conference on Health, Inference, and Learning, avoids some of the pitfalls that snared its predecessor.

“There are large individual differences in physiological processes, making designing personalized health sensing algorithms challenging,” the researchers wrote in their arXiv-available paper. “Existing machine learning systems struggle to generalize well to unseen subjects or contexts and can often contain problematic biases.”

For instance, because the algorithm analyzes how light reflects off the skin, differences in background, lighting, and skin color can throw MetaPhys off.

The first version of the system was trained on a dataset of people’s faces paired with their independently measured pulse and respiration rates. But when presented with data too different from that training set, its performance declined.

MetaPhys improves on these issues by creating a personalized machine learning model for each patient, learning where to focus under different conditions such as lighting and skin color.
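In spirit, that personalization step resembles meta-learning: start from a shared model, then take a few quick gradient steps on a short calibration clip from each new patient. The sketch below is a hedged illustration, not the authors’ code; the architecture, data, and names are hypothetical stand-ins.

```python
# Illustrative only: per-subject adaptation in the spirit of meta-learning.
# The architecture, data, and names here are hypothetical stand-ins,
# not the actual MetaPhys implementation.
import copy
import torch
import torch.nn as nn

# Stand-in for the shared video-to-pulse network.
shared_model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(36 * 36 * 3, 64),
    nn.ReLU(),
    nn.Linear(64, 1),
)

def personalize(model: nn.Module, frames: torch.Tensor,
                targets: torch.Tensor, steps: int = 5) -> nn.Module:
    """Clone the shared model and take a few gradient steps on one
    subject's calibration data, returning a personalized copy."""
    personal = copy.deepcopy(model)
    opt = torch.optim.SGD(personal.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn(personal(frames).squeeze(-1), targets)
        loss.backward()
        opt.step()
    return personal

# A few seconds of (synthetic) calibration frames and reference readings.
frames = torch.randn(32, 3, 36, 36)  # 32 frames of 36x36 RGB face crops
targets = torch.randn(32)            # matching reference pulse signal
subject_model = personalize(shared_model, frames, targets)
```

Even with this kind of per-patient adaptation, however, the system needs work.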

“We acknowledge that there is still a trend toward inferior performance when the subject’s skin type is darker,” Liu said. 

“This is in part because light reflects differently off of darker skin, resulting in a weaker signal for the camera to pick up. Our team is actively developing new methods to solve this limitation.”

That issue, an obvious cause for concern, falls under what the authors call “individual differences,” one of four major challenges facing MetaPhys and other camera-based telehealth measurement systems. Facial hair is another such difference to overcome.

“Environmental differences” include variations in lighting, while “sensor differences” between cameras can mean differences in sensitivity. Finally, there are “contextual differences” in a video, such as when a patient performs an action the algorithm never encountered in its training data.

Despite the challenges, systems like MetaPhys will likely be needed if telehealth is going to fully live up to the hype. 

