At AACR, Google Researchers Introduce AI-Based Cancer Dx, Augmented-Reality Microscope

CHICAGO (360Dx) – At the annual meeting of the American Association for Cancer Research in Chicago Monday, a session on elucidating the complexities of cancer included a talk by Google's Martin Stumpe and Jason Hipp on the advances the company is making in developing cancer diagnostics using artificial intelligence.

Google's technological forays into cancer research made an appearance at last year's annual AACR meeting as well, when the firm's Mark DePristo spoke about his company's efforts to develop deep machine learning and apply those advances to medicine. He particularly highlighted the machines' proficiency in digital pathology and said at the time that his team was trying to develop a deep-learning algorithm for germline variant calling by encoding sequencing data as images and training a deep-learning machine to determine the genotype from the image. Though it was still in the early stages, DePristo said he and his colleagues were trying to extend this technology to the study of cancer.

This year, Stumpe and Hipp — a technology lead and a lead pathologist at Google, respectively — described the firm's newest cancer diagnostics, which were developed using AI.

The first diagnostic specifically detects metastatic breast cancer in lymph nodes. This is a difficult endeavor for pathologists because biopsy images are so large and the metastases they are searching for can be very small, Stumpe said. Further, some benign tissue in the lymph nodes can mimic the look of metastasis. Finally, the sheer amount of data that a pathologist must look through to make a diagnosis can be overwhelming: one biopsy image is equivalent to about 1,000 images from a DSLR camera.

In order to help solve these problems, Google trained an AI algorithm to detect metastases in lymph nodes. The Google team split each biopsy image into roughly 1 million tiles and labeled each tile as benign or metastatic. After millions of training iterations, the AI network learned to classify an image as positive or negative for metastasis, Stumpe said.
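The tiling step described above can be sketched in a few lines of code. This is a minimal illustration, not Google's implementation: the function name, tile size, and the toy 1,024-pixel "slide" are all assumptions for demonstration; real gigapixel slides would be read with a whole-slide imaging library and tiled at much larger scale.

```python
import numpy as np

def split_into_tiles(image, tile_size=256):
    """Split a large slide image (H x W x 3 array) into non-overlapping tiles.

    Edge regions that do not fill a complete tile are dropped for simplicity;
    each returned tile would then be labeled benign or metastatic for training.
    """
    h, w = image.shape[:2]
    tiles = []
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            tiles.append(image[y:y + tile_size, x:x + tile_size])
    return tiles

# Stand-in for a small slide region; a real gigapixel image tiled this way
# would yield on the order of the ~1 million tiles described in the talk.
slide = np.zeros((1024, 1024, 3), dtype=np.uint8)
tiles = split_into_tiles(slide)
print(len(tiles))  # 16
```

Each tile then becomes one labeled training example, which is what makes the otherwise unmanageably large biopsy image tractable for a classifier.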

When the team evaluated the performance of the AI algorithm, Hipp noted, it found that the AI was able to correctly distinguish between tumor tissue and benign mimics. When the researchers compared the performance of the AI algorithm to pathologists in finding metastases in lymph node biopsy images, the algorithm scored 0.91 on a tumor localization score, compared to 0.73 for the pathologists.

Further, Hipp revealed, the Google team conducted a feasibility study to evaluate the utility of the algorithm to pathologists. In the study, which is yet to be published, the researchers found that when they asked six pathologists to analyze biopsy images, the AI improved their speed-to-diagnosis. There was a time savings of 11 percent in negative cases, and a time savings of 32 percent in micro-metastasis cases when pathologists used the algorithm to help them versus when they didn't, Hipp said.

Hipp and Stumpe also developed a similar algorithm to help pathologists assign Gleason scores to prostate cancer tumors, a task that can be difficult and subjective. The team asked pathologists to evaluate prostate cancer images and grade them on the Gleason scale. The pathologists varied in how they rated the tumors, Stumpe and Hipp noted, but in yet-to-be-published data, the Google team found that the AI algorithm performed on par with the pathologists overall.

But before pathologists can really take advantage of these types of AI-based diagnostics, Stumpe said, they need access to high-end digital scanners. Most pathologists still work with standard microscopes, and the expense, significant changes to IT infrastructure, and disruptions to existing workflows required to upgrade to fully digital scanners are a barrier to adoption in the near term.

In order to bridge the gap, Stumpe and Hipp have brought AI to the microscope, developing an augmented-reality system that can attach to standard microscopes. The system adds a small camera to a standard microscope, feeds the image of the sample to a compute unit running the AI models, and sends the tumor probabilities back to a microdisplay in the microscope's eyepiece. In the eyepiece, the user sees the sample with a visual overlay of the AI analysis outlining areas of interest.
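The capture, inference, and overlay loop described above can be sketched as follows. This is a hypothetical illustration of the data flow only: `capture_frame` and `infer_tumor_probabilities` are stand-ins for the real camera feed and AI model, which the article does not detail, and the brightness-based "probability" is purely a placeholder.

```python
import numpy as np

def capture_frame():
    """Stand-in for a frame from the microscope's add-on camera (hypothetical)."""
    return np.random.rand(512, 512, 3)

def infer_tumor_probabilities(frame):
    """Stand-in for the AI model; returns a per-pixel tumor probability map.

    Here brightness is used as a placeholder 'probability' so the sketch runs.
    """
    return frame.mean(axis=2)

def make_overlay(frame, prob_map, threshold=0.5):
    """Highlight regions whose predicted probability exceeds the threshold,
    mimicking the outline the pathologist would see in the eyepiece display."""
    mask = prob_map > threshold
    overlay = frame.copy()
    overlay[mask] = [0.0, 1.0, 0.0]  # flag suspect regions in green
    return overlay

frame = capture_frame()
overlay = make_overlay(frame, infer_tumor_probabilities(frame))
print(overlay.shape)  # (512, 512, 3)
```

In the real system this loop would run continuously so the overlay tracks whatever field of view the pathologist moves to.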

As of now, the system works in breast and prostate cancers, as it was trained using the same AI-based diagnostics the team developed. The prostate overlay in the microscope includes suggestions for Gleason scoring.

In a study, the team found that the system has an AUC of 0.98 for breast cancer and 0.96 for prostate cancer, indicating fairly high accuracy, though there is still room for improvement, Stumpe said. The system could potentially be trained to analyze other cancers as well, he added, and to look for rare events such as mitotic figures and mycobacteria, as well as biomarkers such as Ki67, PR, or P53.
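For readers unfamiliar with the metric, the AUC figures above are the area under the ROC curve: the probability that the model scores a randomly chosen positive (tumor) example higher than a randomly chosen negative one, so 0.5 is chance and 1.0 is perfect ranking. A minimal sketch, using the rank-comparison identity rather than any library (the toy labels and scores below are invented for illustration):

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney identity:
    the fraction of positive/negative pairs where the positive example
    receives the higher score (ties count as half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: the classifier ranks most positives above negatives.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
print(auc(labels, scores))  # 0.888...
```

By this measure, an AUC of 0.98 means the system almost always scores true tumor regions above benign ones, though "almost always" still leaves the room for improvement Stumpe mentioned.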

The microscope system is currently in the prototype stage, and the team is looking for future beta testers. Stumpe also noted that it's been built in a modular way, so it can be retrofitted to almost any kind of regular microscope.