What do an eye hospital in Madurai, Tamil Nadu, and Google founder Larry Page have in common? Quite simply, the future of eyecare.
In 2006, Page flew down to the south Indian temple town to stitch together what was then deemed a mere “long-term working relationship” with Aravind Eye Care. Google didn’t have to do much initially. It trained doctors and other staff in IT and assisted the hospital with its distance education programme. But soon after, the two partners put their heads together to solve one of the most pressing problems in eyecare: prevention of blindness. More specifically, Aravind Eye Care, and subsequently Sankara Nethralaya, wanted to work with Google to develop and validate deep-learning algorithms to help detect diabetic retinopathy, an eye condition that is the most common cause of vision loss among people with diabetes.
Today, after working on 130,000 images from these hospitals with 54 American ophthalmologists, Google has an algorithm, or rather a labelling tool, that grades eye images for disease detection. And no surprise, it does the task a wee bit better than humans. “Our algorithm very closely matched the performance of the American eye doctors. In the ‘F score’, which Google uses to measure its accuracy, the algorithm did just a little better than the human ophthalmologist,” said Lily Peng, a product manager at Google, during a conversation at ‘Made with AI’ in Tokyo on Tuesday.
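The “F score” Peng mentions is, in common usage, the F1 score: the harmonic mean of precision (how many flagged images are truly diseased) and recall (how many diseased images get flagged). A minimal sketch of how such a metric is computed follows; the counts used are purely illustrative and are not Google's published results:

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 = harmonic mean of precision and recall.

    tp: diseased images correctly flagged
    fp: healthy images wrongly flagged
    fn: diseased images missed
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical grader: catches 90 of 100 diseased images (10 missed)
# while wrongly flagging 5 healthy ones.
print(round(f1_score(tp=90, fp=5, fn=10), 3))  # → 0.923
```

Because the harmonic mean punishes imbalance, a grader cannot score well by simply flagging everything (recall 1.0, precision near 0) or by flagging almost nothing; that is why it is a natural single number for comparing an algorithm against human graders.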
It’s interesting how far she has come. Just two years ago, Peng was busy dealing with health-related searches as part of the company’s search team. At that time, Google was dabbling in healthcare through applications like Google Fit, and its research into ageing was focused in its biotech venture Calico. Since Google was actively integrating artificial intelligence (AI) and machine learning into its core products then, Peng and her colleagues got started on a side project with a hypothesis. “I, along with other folks, saw how machine learning was transforming our other core products. And we thought if we were able to detect dog breeds [through] images, maybe we could also help with medical images,” says Peng.
This was also before Google’s current CEO Sundar Pichai reoriented the company’s philosophy from “mobile-first” to “AI-first”. Little did Google realise that what began as a side project would become the company’s flagship AI stamp on healthcare.
Apart from Verily Life Sciences and DeepMind Health, Google has acquired a few startups and is incubating fresh ones through Launchpad Studio. The search giant isn’t articulating set goals like it did with, say, driverless cars. For now, it has two core objectives with regard to AI in healthcare. One, build a tool that makes it easier to detect diseases. Two, help strengthen research across communities. While it does see a business opportunity, and according to Google, a more enterprise-focused one at that, it is in no hurry to monetise its algorithms.