AI Measures Pain Levels by Analyzing Facial Expressions

A new machine-learning system uses a camera and AI to monitor patients' pain levels contactlessly during surgery, offering an objective method of assessment.

Machine Learning System Monitors Patient Pain During Surgery

A camera and AI combo offers a contactless way to assess pain

This article is part of our exclusive IEEE Journal Watch series in partnership with IEEE Xplore.

In the operating room, patients undergoing procedures under local anesthesia remain conscious but may have difficulty expressing how much pain they feel. Some, such as infants or people with dementia, may not be able to communicate these feelings at all. In search of a better way to monitor patients’ pain, a team of researchers has developed a contactless method that analyzes a combination of patients’ heart rate data and facial expressions to estimate the pain they’re feeling. The approach is described in a study published 14 November in the IEEE Open Journal of Engineering in Medicine and Biology.

Bianca Reichard, a researcher at the Institute for Applied Informatics in Leipzig, Germany, notes that camera-based pain monitoring sidesteps the need for patients to wear sensors with wires, such as ECG electrodes and blood pressure cuffs, which could interfere with the delivery of medical care.

To create their contactless approach, the researchers created a machine learning algorithm capable of analyzing aspects of pain that can be detected visually by a camera. First, the algorithm analyzes the nuances of a person’s facial expressions to estimate their pain levels.

The system also uses heart rate data via a technique called remote photoplethysmogram (rPPG), which involves shining a light on a person’s skin. The amount of light reflected back can be used to detect changes in blood volume within their vessels. The researchers initially considered 15 different heart rate variability parameters measured by rPPG to include in their model, and selected the top seven that are statistically most relevant to pain prediction, such as heart rate maximums, minimums, and intervals.
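The rPPG idea described above can be sketched in a few lines: blood-volume changes subtly modulate how much light the skin reflects, so averaging a color channel over a facial region of interest per frame yields a pulse-correlated waveform whose dominant frequency is the heart rate. This is a minimal illustrative sketch, not the authors' implementation; the function names, the green-channel choice, and the ROI convention are all assumptions.

```python
import numpy as np

def rppg_signal(frames, roi):
    """Estimate a raw rPPG waveform from a stack of video frames.

    frames: array of shape (n_frames, height, width, 3), RGB.
    roi: (top, bottom, left, right) bounding box of the face region.
    Averages the green channel over the ROI per frame, since green
    light is strongly absorbed by hemoglobin, then removes the DC offset.
    """
    top, bottom, left, right = roi
    green = frames[:, top:bottom, left:right, 1].astype(float)
    signal = green.mean(axis=(1, 2))
    return signal - signal.mean()

def heart_rate_bpm(signal, fps):
    """Pick the dominant frequency within a plausible heart-rate band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 3.0)  # 42 to 180 beats per minute
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0
```

From a waveform like this one can derive heart rate variability parameters (beat-to-beat intervals, maxima, minima) of the kind the researchers fed into their model; robust peak detection on noisy clinical video is the hard part this sketch omits.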

Pain Prediction Model Training Datasets

The team used two different datasets to train and test their pain prediction model. One is the BioVid Heat Pain Database, a well-established and widely used pain-measurement resource. Researchers created this dataset in 2013 through experiments in which thermodes induced incremental, measurable temperature increases on participants’ skin, and then captured the participants’ physical responses to the corresponding pain.

The second dataset was developed by the researchers for this new work: 29 patients undergoing heart procedures involving insertion of a catheter were surveyed about their pain levels at 5-minute intervals.

Importantly, most other pain prediction algorithms have been trained using very short video clips, but Reichard and her team specifically used longer training videos (ranging from 30 minutes to 3 hours) of realistic surgery scenarios to train their model. For instance, the training videos used may have included scenarios where lighting may not be ideal, or the patient’s face may be partially obscured from the camera at times. “This reflects a more realistic clinical situation compared to laboratory data sets,” Reichard explains.

Tests of their model show that it has a pain prediction accuracy of about 45 percent. Reichard says she is surprised that the model is so accurate, given the number of disruptions that occurred throughout the raw video footage, such as a patient moving on the operating table or changes in the camera angle. While many previously developed pain prediction models achieve higher accuracies, those were trained on short, “ideal” video clips with no visual obstructions. This research team instead used less-than-ideal, but more realistic, video footage to train their model.

What’s more, Reichard notes that the team used a fairly simple statistical machine learning model. “Using more complex approaches, for example based on neural networks, would most likely further improve performance,” she says.

Reichard says she finds this type of research, which could support both patients and medical staff, meaningful. In future work, she plans to develop similar contactless systems that use radar to measure patients’ vital signs in medical settings.

Michelle Hampson is a freelance writer based in Halifax. She frequently contributes to Spectrum's Journal Watch coverage, which highlights newsworthy studies published in IEEE journals.
