An extended-reality (XR) platform for real-time monitoring of patients’ health during surgical procedures is proposed. The system provides real-time access to a comprehensive set of patient information, made promptly available to the surgical team in the operating room (OR). In particular, the XR platform supports the medical staff by automatically acquiring the patient’s vital signs from the operating room instrumentation and displaying them in real time directly on an XR headset. Furthermore, information from the patient’s clinical record is shown upon request. Finally, the XR-based monitoring platform also allows displaying in XR the video stream coming directly from the endoscope. The innovative aspects of the proposed XR-based monitoring platform lie in the comprehensiveness of the available information, its modularity and flexibility (in terms of adaptation to different data sources), its ease of use and, most importantly, its reliable communication, all of which are critical requirements in the healthcare field. To validate the proposed system, experimental tests were conducted using instrumentation typically available in the operating room (i.e., a respiratory ventilator, a patient monitor for intensive care, and an endoscope). The overall results showed (i) a data-communication accuracy greater than 99%, along with (ii) an average response time below ms, and (iii) satisfactory feedback from the System Usability Scale (SUS) questionnaires filled out by the physicians after intensive use.
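The reliable-communication requirement mentioned above can be illustrated with a minimal sketch, assuming a hypothetical checksum-guarded message format for vitals packets (the field names such as `hr_bpm` and the CRC32 scheme are illustrative assumptions, not details of the actual platform). A corrupted packet is detected and dropped rather than displayed to the surgical team:

```python
import json
import zlib

# NOTE: hypothetical packet format for illustration only; the field names
# and the CRC32 framing are assumptions, not the platform's actual protocol.

def encode_vitals(vitals: dict) -> bytes:
    """Serialize a vitals reading and append a CRC32 checksum."""
    payload = json.dumps(vitals, sort_keys=True).encode("utf-8")
    return payload + b"|" + str(zlib.crc32(payload)).encode("ascii")

def decode_vitals(packet: bytes):
    """Validate the checksum; return the reading, or None if corrupted."""
    payload, _, crc = packet.rpartition(b"|")
    if zlib.crc32(payload) != int(crc):
        return None  # drop corrupted packet instead of showing bad data
    return json.loads(payload)

if __name__ == "__main__":
    reading = {"hr_bpm": 72, "resp_rate": 14, "spo2_pct": 98}
    packet = encode_vitals(reading)
    assert decode_vitals(packet) == reading   # intact packet round-trips
    corrupted = bytearray(packet)
    corrupted[0] ^= 0xFF                      # flip one byte in transit
    assert decode_vitals(bytes(corrupted)) is None
```

Counting rejected packets against the total sent is one simple way to express the data-communication accuracy reported in the tests.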
An innovative algorithm to automatically assess the quality of blood perfusion of the intestinal sector in laparoscopic colorectal surgery is proposed. Traditionally, evaluating the uniformity of brightness in indocyanine green-based fluorescence is a purely qualitative, empirical task that relies heavily on the surgeon’s subjective judgment, leading to assessments that are strongly experience-dependent. To overcome this limitation, the proposed algorithm assesses the level and uniformity of the indocyanine green used during laparoscopic surgery. The algorithm adopts a feed-forward neural network that receives as input a feature vector based on the histogram of the green band of the input image. It is used to (i) acquire information related to perfusion during laparoscopic colorectal surgery, and (ii) support the surgeon in objectively assessing the outcome of the procedure. In particular, the algorithm classifies the perfusion as adequate or inadequate. The algorithm was validated on videos captured during surgical procedures carried out at the University Hospital Federico II in Naples, Italy. The obtained results show a classification accuracy of 99.9%, with a repeatability of 1.9%. Finally, the real-time operation of the proposed algorithm was tested by analyzing the video stream captured directly from an endoscope available in the OR.
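The green-band histogram feature and the feed-forward classification step can be sketched as follows. This is a minimal illustration, not the trained network from the study: the bin count, the single hidden unit, the activations, and all weights are assumptions chosen only to make the pipeline concrete.

```python
import math

def green_histogram_features(pixels, bins=16):
    """Normalized histogram of the green channel.
    pixels: list of (r, g, b) tuples with 8-bit channel values."""
    hist = [0] * bins
    for _, g, _ in pixels:
        hist[min(g * bins // 256, bins - 1)] += 1
    return [h / len(pixels) for h in hist]

def ffnn_forward(x, w1, b1, w2, b2):
    """Feed-forward pass: one hidden ReLU layer, sigmoid output
    interpreted as the score for 'adequate' perfusion."""
    h = [max(0.0, sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    z = sum(w * hi for w, hi in zip(w2, h)) + b2
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    # Toy weights (NOT the paper's trained parameters): one hidden unit
    # contrasting bright-green mass (upper bins) against dark mass (lower bins).
    w1, b1, w2, b2 = [[-1.0] * 8 + [1.0] * 8], [0.0], [4.0], -1.0
    bright = [(10, 200, 10)] * 100   # strongly fluorescent frame
    score = ffnn_forward(green_histogram_features(bright), w1, b1, w2, b2)
    label = "adequate" if score >= 0.5 else "inadequate"
```

In the actual system the weights would come from training on the annotated surgical videos, and each endoscope frame would be featurized and classified in the same forward-pass fashion to sustain real-time operation.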