NEW YORK (360Dx) – In a commentary published in the journal JAMA Oncology last week, Balazs Acs and David Rimm, professors in the department of pathology at Yale University School of Medicine, reviewed a study conducted by a group of researchers in the Netherlands who assessed the performance of deep-learning algorithms for detection of lymph node metastasis in hematoxylin and eosin-stained tissue sections from 129 breast cancer cases.
In a competition, some deep-learning algorithms achieved better diagnostic performance than a panel of pathologists, the researchers found. They, along with Rimm and Acs, concluded that the approach will require evaluation and proof of its utility in a clinical setting. However, in an interview with 360Dx, Rimm said that the study could have significant consequences for adoption of digital pathology systems, which until recently have struggled to win over potential customers.
Rimm, who is also director of Yale Pathology Tissue Services, was a scientific cofounder of HistoRx, a digital pathology company sold to Genoptix in 2012; Metamark Genetics, which develops diagnostic and prognostic tests for urology cancer care; and Pixel Gear, a direct-tissue imaging company.
Below is an edited version of the interview.
What are the implications of the results of the Netherlands study?
We can perform a machine learning task that can exceed the capability of a pathologist, and that finally gives us a reason to scan slides and use digital pathology. Computer algorithms have done a better job than we could do in making sure we're delivering the right diagnosis. We've added value to the process. My thought is that this is not digital pathology but intelligent digital pathology, because it brings intelligence and, in this case, artificial intelligence to the process.
Are there any commercial or near-commercial digital pathology systems that consist of deep learning algorithms?
No commercial digital pathology system with a deep learning algorithm has been cleared. So now the question is whether we can go ahead and use these algorithms with digital pathology systems, or whether we will need to obtain US Food and Drug Administration approval for their use. And there's no guideline for that.
What will get this closer to the clinic?
To some extent, it is already happening. People are working on algorithms, for example, that can identify the EGFR or BRAF mutation status on the basis of a [hematoxylin and eosin] stain, so instead of having to wait three weeks for sequencing, you can receive a result almost instantaneously if you have a digital image and can run an algorithm that tells you the patient has a high percentage chance of having a mutation.
If its sensitivity and specificity are 97 percent or higher, you might act on that result and still go ahead and get the sequencing done anyway. [Although] I have not yet seen data that anyone can do this, and most of the data for deep-learning algorithms is still below that level of sensitivity and specificity, it is probably not too far away. A clinician might immediately act on the indication, prior to sequencing, if the test ultimately has high enough sensitivity and specificity. Although the test is looking at morphologic features, it's predicting a molecular change, and when that happens, digital pathology could change care.
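The decision logic Rimm describes can be sketched in a few lines. This is a purely illustrative example: the 97 percent bar comes from the interview, but the confusion-matrix counts and the idea of a simple pass/fail check are assumptions, not anything from a real validation study.

```python
# Hypothetical sketch: deciding whether an H&E-based mutation
# classifier is accurate enough to act on before sequencing.
# The 97% threshold is from the interview; all counts are made up.

def sensitivity(tp, fn):
    """Fraction of true mutation carriers the algorithm flags."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of wild-type cases the algorithm correctly clears."""
    return tn / (tn + fp)

THRESHOLD = 0.97  # the bar mentioned in the interview

# Made-up confusion-matrix counts from a hypothetical validation set
tp, fn, fp, tn = 95, 5, 2, 98

sens = sensitivity(tp, fn)   # 0.95
spec = specificity(tn, fp)   # 0.98
actionable = sens >= THRESHOLD and spec >= THRESHOLD

print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, actionable={actionable}")
```

In this toy case the classifier clears the specificity bar but not the sensitivity bar, so, per the reasoning above, a clinician would still wait for sequencing.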
What is intelligent digital pathology and how does it compare with both standard pathology and digital pathology?
Regular pathology involves looking through a microscope's eyepieces at an optical image of tissue on a glass slide. Digital pathology involves staining a slide and recording the result with a scanner or camera. Intelligent digital pathology involves applying deep-learning algorithms to evaluate those whole-slide pathology images.
Digital pathology has failed to gain much traction over the past decade or so. What have been the main factors preventing adoption?
Although it is a novel advance on standard pathology, digital pathology has had relatively few applications where its benefits provided enough value for pathologists. Teaching and tumor board review have been among the main applications. Digital pathology has proved useful when you want to capture an image and share it with someone because you couldn't bring them the microscope or the slide. So, digital pathology has been most important as a mechanism for sharing images.
Why was the US Food and Drug Administration hesitant to clear digital pathology systems for so long?
The US Food and Drug Administration ruled some years ago that digital pathology was not usable in the clinic, and that meant that companies couldn't market their machines.
With early systems, there were also issues related to resolution, and the FDA said that digital pathology hadn't been comprehensively tested with all different conditions and tumor types. Ultimately, to obtain FDA approval, digital pathology vendors had to test many cases of many different tumor types to ensure that there wasn't one tumor that was particularly challenging to review.
Although companies couldn't market the use of digital pathology systems for clinical purposes, clinicians, especially those distributed among multiple sites, still frequently scanned slides that were read by a central pathologist, which was, I think, technically legal.
Would you say, then, that the main reason that digital pathology has had trouble achieving broad adoption is because it didn't provide much of a benefit over traditional pathology?
Yes. You could read the slide on your computer screen or through your microscope, and it looked the same. So there often was not a compelling reason to go to the extra expense of scanning the slide, manipulating it with software, and keeping it on your computer, when you could just read the glass slide through a microscope and get the same result. By sticking with the microscope, you saved the cost of purchasing a scanner, paying people to operate it, and paying for slide storage.
On the other hand, the resolution and speed of digital imaging systems have increased over the years, and [in April 2017] the FDA cleared Philips' digital pathology solution, marking the first time that the agency allowed the marketing of a whole-slide imaging system for pathology. The firm has since begun a big push toward digitization of slides and use of digital pathology.
What else has changed for digital pathology with respect to its potential for adoption?
The contest held in the Netherlands compared the performance of machine-learning algorithms running on a digital pathology system to that of pathologists reading the digital images.
The machine-learning systems were required to look at slides and find metastatic breast cancer cells in lymph nodes. Using deep learning, several groups developed algorithms that could find those malignant cells, shade them, and count them, which is what a pathologist would normally do.
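The find-shade-count workflow described above typically works by tiling the whole-slide image and scoring each tile for tumor. The sketch below is a minimal, assumed illustration of that pattern: the scoring function is a stand-in for a trained deep-learning classifier, and the tiny "slide" is fabricated for demonstration.

```python
# Minimal sketch of patch-based tumor detection on a whole-slide image:
# tile the image, score each tile, then flag ("shade") and count hits.
# score_patch is a stand-in for a real CNN classifier; all data is toy.

def score_patch(patch):
    """Stand-in for a deep-learning tumor classifier.
    Here we just use the patch's mean intensity as a fake probability."""
    flat = [px for row in patch for px in row]
    return sum(flat) / len(flat)

def detect_tumor(image, patch_size=2, threshold=0.5):
    """Slide a window over the image; return corners of flagged patches."""
    flagged = []
    for y in range(0, len(image), patch_size):
        for x in range(0, len(image[0]), patch_size):
            patch = [row[x:x + patch_size]
                     for row in image[y:y + patch_size]]
            if score_patch(patch) >= threshold:
                flagged.append((y, x))  # top-left corner of a "shaded" patch
    return flagged

# Toy 4x4 "slide" with one bright (suspicious) region in the top-left
slide = [
    [0.9, 0.8, 0.1, 0.1],
    [0.9, 0.9, 0.2, 0.1],
    [0.1, 0.1, 0.1, 0.2],
    [0.2, 0.1, 0.1, 0.1],
]
hits = detect_tumor(slide)
print(f"{len(hits)} suspicious patch(es) at {hits}")
```

A real pipeline would read gigapixel slides with a dedicated library and run a trained network per tile, but the tile-score-count structure is the same.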
How did the machines perform?
The group running the study designed it so that 11 pathologists read the slides in a normal, clinical fashion with limited time and one pathologist read them without time limitations. This was a well-chosen test because a molecular method already exists to find these rare malignant cells. By this method, we stain for cytokeratin, and the positive malignant cells stand out against the lymphocytes and other lymph-node stromal cells in the background.
The contestants downloaded the key images, built software programs with algorithms, evaluated the slides, and sent their results back to the study organizers. The organizers then compared the performance of the algorithms with the performance of the 11 pathologists, the single pathologist, and the molecular test. Seven of the algorithms had higher sensitivities and specificities than the pooled results of the pathologists, and even the pathologist who had no time limitation did not do as well as the top three machine algorithms. These algorithms are not FDA approved and it is not yet clear how they will be used in the clinic. [However], in the future, one can imagine clearance of algorithms to do tasks previously done by pathologists, thereby generating a tool to strengthen the pathologist's diagnosis.
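The kind of comparison the organizers ran, checking whether an algorithm's operating point beats the pooled pathologist result on both sensitivity and specificity, can be sketched as below. All operating points and algorithm names here are invented for illustration; the actual study reported its own metrics over many more cases.

```python
# Hypothetical sketch of comparing algorithm operating points against
# a pooled pathologist operating point. Each entry is
# (sensitivity, specificity); every number here is made up.

pathologist_panel = (0.80, 0.95)  # pooled operating point (illustrative)

algorithms = {
    "alg_a": (0.90, 0.97),
    "alg_b": (0.85, 0.96),
    "alg_c": (0.75, 0.99),
}

def beats(point, reference):
    """True if 'point' is at least as good on both axes and not identical."""
    return (point[0] >= reference[0] and point[1] >= reference[1]
            and point != reference)

better = [name for name, pt in algorithms.items()
          if beats(pt, pathologist_panel)]
print(better)
```

In this toy example, two of the three algorithms dominate the panel on both axes, while the third trades sensitivity for specificity and so does not strictly beat it, which is the same kind of ranking the study used to conclude that seven algorithms outperformed the pooled pathologists.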