Detecting patients’ pain levels via their brain signals

The system could help with diagnosing and treating patients who cannot communicate.



Researchers from MIT and elsewhere have developed a system that measures a patient’s pain level by analyzing brain activity from a portable neuroimaging device.


The system could help doctors diagnose and treat pain in unconscious and noncommunicative patients, which could reduce the risk of chronic pain that can occur after surgery.

Pain management is a surprisingly challenging, complex balancing act. Overtreating pain, for example, runs the risk of addicting patients to pain medication. Undertreating pain, on the other hand, may lead to long-term chronic pain and other complications. Today, doctors generally gauge pain levels according to their patients’ own reports of how they’re feeling. But what about patients who can’t effectively communicate how they’re feeling, or can’t communicate at all, such as children, elderly patients with dementia, or those undergoing surgery?

In a paper presented at the International Conference on Affective Computing and Intelligent Interaction, the researchers describe a method to quantify pain in patients. To do so, they leverage an emerging neuroimaging technique called functional near-infrared spectroscopy (fNIRS), in which sensors placed around the head measure oxygenated hemoglobin concentrations that indicate neuronal activity.

For their work, the researchers use only a few fNIRS sensors on a patient’s forehead to measure activity in the prefrontal cortex, which plays a major role in pain processing. Using the measured brain signals, the researchers developed personalized machine-learning models to detect patterns of oxygenated hemoglobin levels associated with pain responses. When the sensors are in place, the models can detect whether a patient is experiencing pain with around 87 percent accuracy.

“The way we measure pain hasn’t changed over the years,” says Daniel Lopez-Martinez, a PhD student in the Harvard-MIT Program in Health Sciences and Technology and a researcher at the MIT Media Lab. “If we don’t have metrics for how much pain someone experiences, treating pain and running clinical trials becomes challenging. The motivation is to quantify pain in an objective manner that doesn’t require the cooperation of the patient, such as when a patient is unconscious during surgery.”

Traditionally, surgery patients receive anesthesia and medication based on their age, weight, previous diseases, and other factors. If they don’t move and their heart rate remains stable, they’re considered fine. But the brain may still be processing pain signals while they’re unconscious, which can lead to increased postoperative pain and long-term chronic pain. The researchers’ system could provide surgeons with real-time information about an unconscious patient’s pain levels, so they can adjust anesthesia and medication dosages accordingly to stop those pain signals.

Joining Lopez-Martinez on the paper are: Ke Peng of Harvard Medical School, Boston Children’s Hospital, and the CHUM Research Centre in Montreal; Arielle Lee and David Borsook, both of Harvard Medical School, Boston Children’s Hospital, and Massachusetts General Hospital; and Rosalind Picard, a professor of media arts and sciences and director of affective computing research in the Media Lab.

Focusing on the forehead

In their work, the researchers adapted the fNIRS system and developed new machine-learning techniques to make the system more accurate and practical for clinical use.

To use fNIRS, sensors are traditionally placed all around a patient’s head. Different wavelengths of near-infrared light shine through the skull and into the brain. Oxygenated and deoxygenated hemoglobin absorb the wavelengths differently, altering their signals slightly. When the infrared signals reflect back to the sensors, signal-processing techniques use the altered signals to calculate how much of each hemoglobin type is present in different regions of the brain.
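
The article doesn’t spell out the math, but the conversion it describes is conventionally done with the modified Beer-Lambert law: measuring the attenuation change at two wavelengths gives two equations in the two unknown hemoglobin concentration changes. A minimal sketch of that calculation, with placeholder inputs and function names of our own, might look like this:

```python
# Sketch (not the authors' code): converting two-wavelength fNIRS attenuation
# changes into hemoglobin concentration changes via the modified Beer-Lambert law.
# Extinction coefficients, distances, and pathlength factors are placeholders.
import numpy as np

def hemoglobin_changes(delta_A, ext_coeffs, distance_cm, dpf):
    """
    delta_A     : (2,) change in optical density at the two wavelengths
    ext_coeffs  : (2, 2) extinction coefficients
                  [[eps_HbO(l1), eps_HbR(l1)], [eps_HbO(l2), eps_HbR(l2)]]
    distance_cm : source-detector separation
    dpf         : (2,) differential pathlength factor per wavelength
    Returns (delta_HbO, delta_HbR).
    """
    pathlength = distance_cm * np.asarray(dpf)      # effective path per wavelength
    # Solve the 2x2 linear system: delta_A = (E @ delta_C) * pathlength
    A = ext_coeffs * pathlength[:, None]
    delta_HbO, delta_HbR = np.linalg.solve(A, np.asarray(delta_A))
    return delta_HbO, delta_HbR
```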

When a patient is hurt, regions of the brain associated with pain will see a sharp rise in oxygenated hemoglobin and decreases in deoxygenated hemoglobin, and these changes can be detected through fNIRS monitoring. But traditional fNIRS systems place sensors all around the patient’s head. This can take a long time to set up, and it can be difficult for patients who must lie down. It also isn’t really feasible for patients undergoing surgery.

Therefore, the researchers adapted the fNIRS system to measure signals only from the prefrontal cortex. While pain processing involves information from multiple regions of the brain, studies have shown that the prefrontal cortex integrates all of that information. That means the researchers need to place sensors only over the forehead.

Another problem with traditional fNIRS systems is that they capture some signals from the skull and skin that contribute to noise. To fix that, the researchers installed additional sensors to capture and filter out those signals.
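
The article doesn’t name the filtering technique, but a common approach with such extra sensors, often called short-separation channels because they mostly sample the scalp and skull, is to regress their signal out of the deeper channels. A hedged sketch, assuming one shallow reference channel per deep channel:

```python
# Sketch of one common approach (short-separation regression); the assumption is
# that the extra sensor records a "shallow" signal dominated by scalp/skull flow.
import numpy as np

def remove_superficial_signal(deep, shallow):
    """Least-squares regress the shallow channel out of the deep channel.
    deep, shallow: 1-D hemoglobin time series of equal length."""
    shallow = shallow - shallow.mean()
    deep_centered = deep - deep.mean()
    # Scaling of the shallow signal that best explains the deep signal
    beta = np.dot(shallow, deep_centered) / np.dot(shallow, shallow)
    return deep_centered - beta * shallow
```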

An emerging neuroimaging technique called functional near-infrared spectroscopy (fNIRS) could help detect pain.

Pat Greenhouse/The Boston Globe via Getty Images

Personalized pain modeling

On the machine-learning side, the researchers trained and tested a model on a labeled pain-processing dataset they collected from 43 male participants. (Next they plan to collect a lot more data from diverse patient populations, including female patients — both during surgery and while conscious, and at a range of pain intensities — in order to better evaluate the accuracy of the system.)

Each participant wore the researchers’ fNIRS device and was randomly exposed to an innocuous sensation and then about a dozen shocks to their thumb at two different pain intensities, measured on a scale of 1-10: a low level (about 3/10) or a high level (about 7/10). Those two intensities were determined with pretests: the participants self-reported the low level as being only strongly aware of the shock without pain, and the high level as the maximum pain they could tolerate.

In training, the model extracted dozens of features from the signals related to how much oxygenated and deoxygenated hemoglobin was present, as well as how quickly the oxygenated hemoglobin levels rose. Those two metrics — quantity and speed — give a clearer picture of a patient’s experience of pain at the different intensities.
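
The full feature set isn’t listed here, but the two kinds of features described, how much the hemoglobin levels change and how fast the oxygenated hemoglobin rises, could be computed along these lines (the window, sampling rate, and specific features are our assumptions):

```python
# Sketch of pain-related features from a post-stimulus window of hemoglobin
# time series; the specific features and window here are illustrative only.
import numpy as np

def window_features(hbo, hbr, fs=10.0):
    """hbo, hbr: 1-D arrays for one channel over a post-stimulus window,
    sampled at fs Hz. Returns a small feature vector."""
    t = np.arange(len(hbo)) / fs
    slope_hbo = np.polyfit(t, hbo, 1)[0]       # how quickly HbO rises
    return np.array([
        hbo.mean(), hbo.max(), slope_hbo,      # magnitude and speed of HbO change
        hbr.mean(), hbr.min(),                 # the corresponding HbR decrease
    ])
```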

Importantly, the model also automatically generates “personalized” submodels that extract high-resolution features from individual patient subpopulations. Traditionally, in machine learning, one model learns classifications — “pain” or “no pain” — based on average responses of the entire patient population. But that generalized approach can reduce accuracy, especially with diverse patient populations.

The researchers’ model instead trains on the entire population but simultaneously identifies shared characteristics among subpopulations within the larger dataset. For example, pain responses to the two intensities may differ between young and old patients, or depending on gender. This generates learned submodels that break off and learn, in parallel, patterns of their patient subpopulations. At the same time, however, they’re all still sharing information and learning patterns shared across the entire population. In short, they’re simultaneously leveraging fine-grained personalized information and population-level information to train better.
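
The paper’s exact architecture isn’t given in this article, but the idea of sharing population-level structure while specializing to subpopulations can be sketched, for illustration only, as a network with a shared trunk and one classification head per subgroup (the group labels and sizes are assumptions):

```python
# Sketch of the general idea (not the authors' architecture): a shared trunk
# learns population-level patterns while per-group "heads" specialize.
import torch
import torch.nn as nn

class SharedPlusGroupHeads(nn.Module):
    def __init__(self, n_features, n_groups, hidden=32):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        # One small "pain" vs. "no pain" classifier head per subpopulation
        self.heads = nn.ModuleList([nn.Linear(hidden, 2) for _ in range(n_groups)])

    def forward(self, x, group_ids):
        h = self.trunk(x)                       # parameters shared by everyone
        # Route each sample through its own subgroup's head (group_ids: list of ints)
        logits = torch.stack([self.heads[g](h[i]) for i, g in enumerate(group_ids)])
        return logits
```

Training such a model on everyone’s data at once lets the shared trunk learn population-wide patterns while each head adapts only to its own subgroup, roughly the division of labor the researchers describe.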

The personalized models and a traditional model were evaluated on classifying “pain” or “no pain” in a random held-out set of participant brain signals from the dataset, for which the self-reported pain scores were known. The personalized models outperformed the traditional model by about 20 percent, reaching about 87 percent accuracy.
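
A hold-out comparison like the one described comes down to scoring both models on the same unseen data; in the sketch below, the two fitted models and the personalized model’s group-aware predict call are hypothetical stand-ins.

```python
# Sketch of the hold-out comparison; `pooled_model` and `personalized_model`
# are assumed, already-fitted classifiers (the latter also taking group labels).
from sklearn.metrics import accuracy_score

def compare_on_holdout(pooled_model, personalized_model, X_test, y_test, groups_test):
    acc_pooled = accuracy_score(y_test, pooled_model.predict(X_test))
    acc_personalized = accuracy_score(
        y_test, personalized_model.predict(X_test, groups_test))
    return acc_pooled, acc_personalized
```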

“Because we are able to detect pain with this high accuracy, using only a few sensors on the forehead, we have a solid basis for bringing this technology to a real-world clinical setting,” Lopez-Martinez says.

Reprinted with permission of MIT News. Read the original article.
