NEW YORK — By combining optical imaging with artificial intelligence, a research team led by scientists from the University of Michigan has developed a model that can classify surgical samples of common brain tumors in real time as accurately as a pathologist.
The approach, the investigators write in a report appearing in Nature Medicine on Monday, could also be extended to other cancers where intraoperative histology is an important part of clinical care.
Brain cancer patients often undergo surgery during which a tumor is biopsied to provide a preliminary diagnosis and to help guide surgical management. Specimens, however, typically must be shipped to a laboratory where they are processed, slides are prepared, and samples are reviewed by a pathologist.
To streamline this process, the UM investigators developed a workflow based on stimulated Raman histology (SRH), a label-free optical imaging method that produces submicrometer-resolution images of unprocessed biologic tissue, using the intrinsic vibrational properties of lipids, proteins, and nucleic acids to generate image contrast.
SRH imagers are available for use in operating rooms, but diagnostic interpretation still requires a trained neuropathologist, and such specialists are not always available, the investigators wrote.
Hypothesizing that AI could be used to assist in the interpretation of histologic images and expand access to expert-level intraoperative diagnosis, the researchers incorporated a deep convolutional neural network (CNN) for diagnostic prediction into their workflow.
The CNN was trained on more than 2.5 million SRH images to classify tissue into 13 histologic categories, with a focus on commonly encountered brain tumors, and returns a prediction in under 150 seconds. In a prospective clinical study of 278 brain cancer patients at three hospitals, CNN-based diagnosis of SRH images had an overall accuracy of 94.6 percent, compared with 93.9 percent for pathologist-based interpretation of conventional histologic images.
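The article does not describe the network's architecture or training setup, so the following is only a rough sketch of how a patch-level classifier over 13 histologic categories might be assembled in PyTorch; the ResNet-18 backbone, two-channel SRH input, and 300x300 patch size are illustrative assumptions, not details from the study.

```python
# Hypothetical sketch of a patch-level SRH classifier (architecture details assumed,
# not taken from the Nature Medicine study).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 13  # histologic categories reported in the article


class SRHPatchClassifier(nn.Module):
    """Classifies a single SRH image patch into one of 13 histologic categories."""

    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Backbone choice (ResNet-18) is illustrative only.
        self.backbone = models.resnet18(weights=None)
        # Assume a two-channel SRH input; adapt the first convolution accordingly.
        self.backbone.conv1 = nn.Conv2d(2, 64, kernel_size=7, stride=2, padding=3, bias=False)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x)  # raw class logits


if __name__ == "__main__":
    model = SRHPatchClassifier()
    patch = torch.randn(1, 2, 300, 300)          # one dummy two-channel 300x300 patch
    probs = torch.softmax(model(patch), dim=1)   # per-class probabilities
    print(probs.argmax(dim=1))                   # predicted histologic category index
```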
The investigators' model made 14 errors, nine of them involving glial tumors, which share morphologic characteristics with other tumors despite divergent clinical presentations and radiographic appearances. Pathologists, by comparison, made 17 errors, 10 of them involving malignant gliomas.
Highlighting the potential of CNN-based SRH diagnosis as an aid to pathologists, the model correctly classified every case that the pathologists misdiagnosed, while the pathologists correctly diagnosed every case that the model got wrong.
Notably, the research team also implemented a semantic segmentation method to identify tumor-infiltrated diagnostic regions within SRH images.
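How the segmentation step was implemented is not described in the article; one common way to produce such a map, assumed here purely for illustration, is to slide a patch classifier across the full SRH mosaic and accumulate its per-class probabilities into a pixel-wise label map, roughly as sketched below (reusing the hypothetical classifier above).

```python
# Hypothetical patch-wise segmentation sketch: tile the SRH mosaic, classify each tile,
# and paint the predicted class probabilities back into a label map. This is an assumed
# approach for illustration, not the study's published method.
import torch


def segment_srh_image(model, image: torch.Tensor, num_classes: int = 13,
                      patch: int = 300, stride: int = 100) -> torch.Tensor:
    """image: (C, H, W) SRH mosaic (larger than one patch); returns an (H, W) class-index map."""
    model.eval()
    _, h, w = image.shape
    votes = torch.zeros(num_classes, h, w)  # accumulated per-class probabilities per pixel
    with torch.no_grad():
        for y in range(0, h - patch + 1, stride):
            for x in range(0, w - patch + 1, stride):
                tile = image[:, y:y + patch, x:x + patch].unsqueeze(0)
                probs = torch.softmax(model(tile), dim=1)[0]           # shape: (num_classes,)
                votes[:, y:y + patch, x:x + patch] += probs[:, None, None]
    return votes.argmax(dim=0)  # per-pixel predicted histologic category
```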
"Our workflow provides a transparent means of delivering expert-level intraoperative diagnosis where neuropathology resources are scarce, and improving diagnostic accuracy in resource-rich centers," the scientists wrote. "The workflow also allows surgeons to access histologic data in near real-time, enabling more seamless use of histology to inform surgical decision-making based on microscopic tissue features."
Although the workflow was developed in the context of neurosurgical oncology, many histological features used to diagnose brain cancers are applicable to tumors in other organs, they added. As such, the approach could likely be extended to dermatology, head and neck surgery, breast surgery, and gynecology.