Calibration of Vision-Based Measurement of Pain Intensity with Multiple Expert Observers

Natoli S.;
2019-01-01

Abstract

Facial expression is an important component of the pain measurement process and can be used by clinicians with patients whose communicative or cognitive abilities are impaired. However, given the complex nature of pain and the absence of a standard reference, accurate measurement of pain in hospital settings is difficult. In this paper, we present a vision-based measurement system that automatically measures pain intensity through the analysis of facial expression, based on deep-learning strategies. To provide a metrological characterization of the system, we designed a benchmark for estimating the measurement precision via multiexpert judgments. As a major contribution of this paper, we investigated interobserver and intraobserver variability as sources of uncertainty at the calibration level, in order to provide robust pain estimation via an automatic framework. Reproducibility and standard reference uncertainty (due to interobserver and intraobserver variability, respectively) have been estimated, modeled, and propagated through the proposed platform using procedures compliant with the Guide to the Expression of Uncertainty in Measurement (GUM). Numerical results, reported as average values and confidence intervals of the classification accuracy, indicate that such a system can facilitate decision-making in healthcare. Moreover, the proposed benchmark offers new insights into rationally incorporating subjective expert judgments within automatic measurement systems.
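The abstract describes propagating two uncorrelated uncertainty components (reproducibility from interobserver variability, standard reference uncertainty from intraobserver variability) in a GUM-compliant way. As a minimal sketch of that idea only, the snippet below combines two independent standard uncertainties by root-sum-of-squares and applies a coverage factor; the function name and numeric values are illustrative assumptions, not taken from the paper.

```python
import math

def combined_standard_uncertainty(u_inter: float, u_intra: float) -> float:
    """Combine two independent uncertainty components by root-sum-of-squares,
    per the GUM law of propagation (uncorrelated inputs, unit sensitivity
    coefficients)."""
    return math.sqrt(u_inter**2 + u_intra**2)

# Hypothetical standard uncertainties, in pain-scale units (illustrative only):
u_inter = 0.8   # reproducibility: interobserver variability
u_intra = 0.5   # standard reference uncertainty: intraobserver variability

u_c = combined_standard_uncertainty(u_inter, u_intra)
U = 2 * u_c     # expanded uncertainty with coverage factor k = 2 (~95 % level)
```

Under these assumptions, the expanded uncertainty U would bound the pain-intensity estimate at roughly a 95 % confidence level.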
Files for this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11571/1488631
Citations
  • PMC: N/A
  • Scopus: 14
  • Web of Science: 12