
Google’s new AI dermatologist can help you figure out what that mole is

Google is putting its artificial intelligence chops to use identifying possible skin conditions.

[Photo: courtesy of Google]

On Tuesday, at Google’s annual I/O developer conference, the company announced the launch of a new search tool for skin, nail, and hair conditions to serve the two billion people around the world who suffer from them. The technology, validated in a paper published in Nature last year, is nearly as good as a dermatologist at identifying 26 skin conditions, and more accurate than the primary care physicians and nurses in the study. The new search tool, which will launch later this year, serves as another example of how the company thinks that it can support doctors and patients through everyday products.


The dermatological assistant lives inside Google Search and requires at least a 3G connection. To use it, a person must provide consent and then upload three well-lit photos. The program will then ask a series of questions about the condition. Users can bypass this step, but Google product manager and physician Dr. Lily Peng says answering the questions will make the results more accurate. The tool will then serve up a list of possible matches, with the top three flagged as the most likely culprits. If the AI is less confident in its suggestions, it will note that it is still learning about certain conditions. In addition to naming skin conditions, the tool will show articles and other related content.

[Image: courtesy of Google]
Users can save their results, delete them, or donate them to Google’s internal research efforts. For those who choose to save or donate, the data is encrypted both in transit and in storage, and the company says it will not use the data to target ads.

[Image: courtesy of Google]
During the three years of research and development that went into the tool, Google trained its dermatological assistant on millions of de-identified skin images. To ensure its technology worked across skin types and tones, Google partnered with 17 clinics to bring in 65,000 de-identified photos of patients’ skin. It can now identify 288 skin, hair, and nail conditions among the over 3,000 conditions that fall within the purview of a dermatologist, according to the American Academy of Dermatology Association.


Google says its dermatological assistant is not a diagnostic tool, though both the U.S. and European governments have classified it as a low-risk medical device. Rather, the tool is supposed to help guide patients as they engage with their doctor. “As a doctor I want to make sure my patients have the information and tools they need to make the best decisions and to come to the clinic really informed,” says Karen DeSalvo, Google’s chief health officer, and a former assistant secretary for health at the United States Department of Health and Human Services under the Obama administration.

[Animation: courtesy of Google]

The technology also has the potential to help doctors make better diagnoses in the clinic. While it does not outperform dermatologists, it can assist nonspecialists: a recent JAMA study found the technology improved both primary care doctors’ and nurses’ ability to identify skin conditions. That could be particularly helpful to doctors in rural areas with few dermatologists, as well as to patients who have a hard time finding dermatological care.

“Derm is notoriously challenging, and the ability to up-level the diagnostics if we can get to that place with the tool . . . would be helpful not only for clinicians,” says DeSalvo.


[Image: courtesy of Google]
When people can’t access a dermatologist, they already turn to the web for answers. Every year, Google fields 10 billion search queries related to skin, hair, and nail issues, says DeSalvo, but users only find relevant information 13% of the time.

This year Google has pushed more health tools into everyday products. In February, the company launched a heart and breathing rate monitor within its Pixel phone. It also put sleep detection inside its smart home device, the Nest Hub. But Google has bigger public health ambitions. It continues to invest in projects with academic partners to build AI screening tools for breast cancer, tuberculosis, and blindness caused by diabetes. And increasingly, its health efforts are popping up outside the lab and landing in the palm of your hand.

About the author

Ruth Reader is a writer for Fast Company. She covers the intersection of health and technology.
