The suggested link between cell phones and cancer has been around for decades, but only recently has it taken a beneficial turn. Now a new project at the University of Houston is exploring whether an iPhone app can identify a suspicious mole or lesion and determine whether it is likely to be cancerous. But would you actually use an app without also going to the doctor? What's the point of a medical app if people don't trust the tech as much as a human?
DermoScreen, developed by University of Houston professor Dr. George Zouridakis and the MD Anderson Cancer Center, works simply enough. Users snap a picture of a potentially problematic mole or lesion, then the app automatically analyzes the picture using algorithms based on the same criteria used by professional dermatologists to identify cancerous growths—namely the so-called ABCD rule, 7-point checklist, and Menzies' method.
But here's the crazy thing: Early testing of the technology has shown it to be accurate about 85% of the time, which is similar to the accuracy rate for trained dermatologists—and more accurate than non-specialist primary care physicians.
There have been other apps in the past that have taken advantage of the iPhone’s built-in camera to help screen for cancer, but they've always relied on human beings on the other end of the app. "Most [apps claiming to do the same job] offer a telemedicine type service, whereby a user uploads a picture to a website and then it is analyzed off line by a dermatologist," Zouridakis says. Several years ago the University of Michigan created UMSkinCheck, an app that performs a similar task to DermoScreen, but uses human beings, Mechanical Turk-style, to analyze user photos.
Within 10 to 15 seconds of taking a picture, the app provides a score based on a series of objective dermoscopic rules, which look for the presence or absence of characteristic features of melanoma. These include lesion geometry (is the lesion symmetric?), color (is it single- or multi-colored? Are blue-white colors or black dots present?), and texture (is it smooth or rough? Are there irregular streaks or star-like pigmentation?).
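To give a flavor of how rule-based scoring like this works, here is a minimal sketch of the ABCD rule of dermoscopy, one of the three criteria mentioned above. The weights and thresholds are the conventional ones from the dermoscopy literature; the computer-vision step that would extract the four sub-scores from a photo is omitted, and none of this represents DermoScreen's actual implementation.

```python
# Illustrative sketch of the ABCD rule's Total Dermoscopy Score (TDS).
# The four inputs would normally be extracted from the lesion image;
# here they are supplied by hand.

def abcd_total_dermoscopy_score(asymmetry: int, border: int,
                                colors: int, structures: int) -> float:
    """Compute the TDS from the four ABCD sub-scores.

    asymmetry:  0-2 (axes of asymmetry in the lesion)
    border:     0-8 (octants with an abrupt pigment cutoff)
    colors:     1-6 (distinct colors, e.g. blue-white, black)
    structures: 1-5 (dermoscopic structures, e.g. dots, streaks)
    """
    return asymmetry * 1.3 + border * 0.1 + colors * 0.5 + structures * 0.5

def classify(tds: float) -> str:
    # Conventional TDS cutoffs: < 4.75 benign, 4.75-5.45 suspicious,
    # > 5.45 highly suspicious for melanoma.
    if tds < 4.75:
        return "benign"
    if tds <= 5.45:
        return "suspicious"
    return "highly suspicious for melanoma"

# A symmetric, single-color, smooth lesion scores low (TDS = 1.0):
print(classify(abcd_total_dermoscopy_score(0, 0, 1, 1)))  # benign
# An asymmetric, multi-color lesion with irregular streaks and dots
# scores 7.1, well past the melanoma threshold:
print(classify(abcd_total_dermoscopy_score(2, 5, 4, 4)))
```

In practice, the hard part is not this arithmetic but reliably extracting the asymmetry, border, color, and structure scores from an image, which is what the app's algorithms automate.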
To enable the app to spot low-level features not always visible to the human eye, Zouridakis’ system, at least in its current state, uses what is called a dermoscope: a magnification accessory with a built-in light that clips onto the iPhone like an Olloclip detachable lens. The result is hospital-quality imaging from the comfort of your own home.
DermoScreen is also not your everyday app. Far from a side project or a quick hack built to exploit changing iPhone features, the app’s development has occupied Zouridakis’ professional focus for most of the past decade (even though the iPhone has only been around for seven years).
Prior to moving to his current position at the University of Houston, Zouridakis was a faculty member at the University of Texas Medical School. One day a colleague showed him an early model dermoscope—a prototype of the same device DermoScreen uses today. Zouridakis was impressed by what the technology could do with the aid of a digital camera, but felt the process was unnecessarily slow.
"The images captured with the camera had to be transferred manually to a PC for offline visual inspection," he says. "I told my colleague that we could develop automated analysis algorithms and even put a microprocessor in the dermoscope to do all the analysis on the device in real time. He said, ‘Sure, go ahead!’"
Zouridakis did. When he moved to UH Computer Science, he began working on automated lesion analysis. While he was doing this he was approached by a student doing a PhD dissertation, who was also a full-time employee of Texas Instruments. By coincidence, another colleague from Electrical Engineering was also supervising a different PhD student from Texas Instruments. Together the team worked to develop software and design boards, eventually building two separate hardware prototypes that could capture images, process them in real time, transmit them via Bluetooth to a PC, and display them on a monitor. In 2007, Zouridakis demoed his first hardware prototype at the Texas Instruments Developer Conference in Dallas.
The real breakthrough, Zouridakis says, was the launch of the iPhone that year. Once he saw a portable device featuring a built-in camera and enough computing power to process its image, he knew he had found what he was looking for. "After that, we focused exclusively on software," he says. "Several PhD dissertations and MS theses later, we have better and faster algorithms, while the iPhone itself is faster and more efficient."
Today DermoScreen is getting closer to the approval stages needed to create a public-facing product. "We have [successfully] tested the software engine behind the app with thousands of images published in databases and atlases of dermoscopy," Zouridakis says. Through the recent partnership with MD Anderson Cancer Center, arguably the best cancer center in the world, prospective studies will validate the technology against the gold standard of medical practice.
Currently the app is not ready to be rolled out; its interface is unfinished, for one thing. But it’s getting there, and Zouridakis is also starting to look into other diagnostic uses for the technology, such as testing the device’s ability to screen for Buruli ulcer, a flesh-eating bacterial disease, in Africa.
Ultimately he believes tools like DermoScreen could have a transformative effect on the lives of millions of potential patients.
What makes DermoScreen particularly fascinating is, paradoxically, that it’s not unique. The past few years have seen a wave of transformative medical and health-tracking devices and applications. And if the rumors are true, Apple is set to do what it does best: Take this existing technology and help popularize it by packaging it in a sleek, user-friendly experience.
In the same way that the arrival of the personal computer in the 1970s took computing out of the laboratory and made it a one-on-one experience, so too do today’s health care apps democratize medicine by placing the patient at the center of his or her health-tracking universe.
"The current medical system is crazy when you think about the technology that’s available to us now," says Dr. Jesse Slade Shantz, the Chief Medical Officer behind OMsignal, a $199 sensor-filled shirt that works with your iPhone to keep tabs on your stress, fitness, and general well-being. "We have GPS, we have accelerometers, we have all these ways of tracking biometrics—with the tools we have we can really start to democratize health care in a way that empowers patients. We want to turn the current sick-care system into a health care one."
In the same way that fields like facial recognition and data-driven analysis have moved from interesting hypotheticals to actionable fields in the past few years, thanks to advances in technology, so too is patient-driven health care making that transition.
"The trends are coming from every direction at once with the proliferation of data and information that can be accessed and used," says Chris Garson, cofounder and CTO of Nudge, a new mobile health app for iOS and Android which indexes data from the 100,000 health-tracking apps available into a single dashboard. "We’ve always been in charge of our health tracking, in terms of deciding when to visit the doctor and such, but there haven't previously been the capabilities and additional interfaces to use to take actionable next steps. That’s what I see this whole field being about—allowing patients to take control of their own situation. This simply didn’t exist 10 years ago."
Even giant companies like Apple (with its upcoming health-tracking iWatch) and Samsung have demonstrated that they believe health-tracking to be tech’s next big wave—investing millions of dollars in the bet that it can be the next iPod in terms of mass market engagement.
Since the 1950s, and the publication of Paul E. Meehl’s groundbreaking book Clinical vs. Statistical Prediction: A Theoretical Analysis and a Review of the Evidence, it has been known that predictions made by statistical algorithms consistently beat clinical predictions made by trained experts. The same holds in medicine, which is why algorithms have played an increasingly large role in the field over the past several decades.
With the addition of tools like DermoScreen—which demonstrate that even our smartphones now have the tools to make accurate clinical predictions—what does this mean for the future of the physician? Well, as with every other profession, the democratization of technology means that roles will change.
"I don’t see this replacing the need for doctors, although it definitely will change people’s relationship with their physician," says Mac Gambill, CEO of Nudge. "As the patient, today you can capture everything that happens outside a medical office and take it into your doctor’s office where—between you—you can make a more educated decision based on all the data that’s being captured."
Zouridakis, too, considers it a step change—although not one that means we’ll be waving goodbye to our local GP any time soon. "Even though several studies have shown that computer models can match and in certain cases outperform human experts, in my opinion it is too early to delegate diagnosis to machines," he says. "Until the accuracy and efficacy of a smartphone app has been validated by different clinical groups and under different conditions, deployment of these devices for home use should be done primarily for screening purposes only."
"Current technological advances and increased acceptance of tracking and monitoring systems will soon be enabling more personalized and patient-driven health care," says Peter Flach, professor of Artificial Intelligence at the U.K.'s University of Bristol, and also one of the driving forces behind residential home-quantified medical project SPHERE. "It seems clear that [giving physicians as well as individuals access to medically relevant data] can bring great advantages—as with mobile phones and the Internet, we will soon wonder how we ever managed without. On the other hand, there is no magic: We may not always get better answers, but an ability to phrase the question more precisely is already very welcome."
As with the best technological solutions, the aim is not to replace humanity—but to augment it. "[These tools] should be seen as assistive devices that can help physicians make a better decision," Zouridakis says. "Otherwise, in the hands of non-experts, when smartphone apps are used as a replacement for medical advice, they may give a false sense of security, delay diagnosis of a malignant lesion, and ultimately harm the patient."
It’s an exciting time to be involved with medicine—even just as a patient.
[Image: Flickr user Sonny Abesamis]