
This free AI reads X-rays as well as doctors

What happens when free, open-source software can diagnose us as well as doctors?

[Screenshot: mlmed.org]

I’ve never read a chest X-ray before. I couldn’t spot pneumonia, edema, nodules, or atelectasis, nor do I know what most of those terms mean without googling them. But if I were to upload my own chest X-ray to this free website, it could diagnose 14 diseases with 80% accuracy: in other words, about as well as a real radiologist.


There’s no co-pay. No prescription. And the consultation is completely private.

How could this possibly work? It’s all thanks to open-source AI that runs inside your web browser. It’s what happens when insurance companies and big pharma stop making the rules of healthcare, and the age of WebMD self-diagnosis is supercharged with machine learning.

The Chester AI radiology assistant was developed in research led by Joseph Paul Cohen, a postdoctoral fellow at Mila (the Quebec AI institute) and the University of Montreal. He used an NIH dataset of chest X-rays and disease labels to train software to spot those diseases in new scans. Though he is not a clinical doctor, Cohen focuses on the intersection of health and deep learning. Previously, he created an app called BlindTool, which used machine learning to let a phone’s camera serve as the eyes for someone with vision impairment.

The user interface isn’t exactly beautiful, with red and green sliders that depict the likelihood that you have each disease. But it is clear. At a glance, you can tell whether you have a 7% or 70% chance of having pneumonia. Furthermore, with Chester AI, Cohen has created what he believes to be the first AI that can diagnose diseases in someone’s browser, with the model running locally on your machine rather than in the cloud.
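To give a rough sense of how this kind of in-browser diagnosis can work, here is a minimal sketch using TensorFlow.js. The model URL, the 224-by-224 input size, the preprocessing steps, and the label ordering are all assumptions for illustration; Chester’s actual model and pipeline may differ.

```typescript
import * as tf from '@tensorflow/tfjs';

// Hypothetical label list: the 14 findings in the NIH ChestX-ray14 dataset.
// The real model's output ordering may differ.
const LABELS = [
  'Atelectasis', 'Cardiomegaly', 'Effusion', 'Infiltration', 'Mass',
  'Nodule', 'Pneumonia', 'Pneumothorax', 'Consolidation', 'Edema',
  'Emphysema', 'Fibrosis', 'Pleural Thickening', 'Hernia',
];

// Hypothetical URL: the weights are just static files the browser
// downloads once, like any other page asset.
const MODEL_URL = 'https://example.org/chester/model.json';

let model: tf.GraphModel | null = null;

async function loadModel(): Promise<tf.GraphModel> {
  if (!model) {
    model = await tf.loadGraphModel(MODEL_URL);
  }
  return model;
}

// Classify an X-ray that has already been decoded into an <img> element.
// Returns one independent probability per finding (multi-label, not softmax).
export async function classifyXray(img: HTMLImageElement): Promise<Record<string, number>> {
  const net = await loadModel();

  const input = tf.tidy(() => {
    const pixels = tf.browser.fromPixels(img);                    // H x W x 3 pixel tensor
    const resized = tf.image.resizeBilinear(pixels, [224, 224]);  // assumed input resolution
    return resized.toFloat().div(255).expandDims(0);              // scale to [0, 1], add batch dim
  });

  // Assumes the exported model ends in a sigmoid, so each output is
  // already a per-disease probability.
  const output = net.predict(input) as tf.Tensor;
  const probs = await output.data();

  input.dispose();
  output.dispose();

  const result: Record<string, number> = {};
  LABELS.forEach((label, i) => { result[label] = probs[i]; });
  return result;
}
```

Those probabilities are exactly what the red and green sliders visualize: a value near 0.07 reads as a 7% chance, one near 0.70 as 70%.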

Locally run AI is not a new idea. For instance, Apple runs AI on iPhones to do things like make a shortlist of apps you might want to open next. And Google’s Pixel smartphone has all sorts of superpowers, like seeing in the dark, that are based on on-device AI.

But localized AI is new for healthcare. And in this case, the model has profound consequences. For one, it ensures privacy. The surveillance economy has created a world in which we have no idea who is getting our data, and what they might be using it for. Chester AI is like a software doctor that lives on your phone or PC, and only on your phone or PC. It doesn’t upload sensitive personal information, ever. There are self-diagnostic apps for identifying whether or not a mole might be cancer, but from Cohen’s research, all of these apps appear to send data online. (And in general, there aren’t many self-diagnostic apps in places like the App Store, because there are complicated legal restrictions around the distinction between consumer devices and medical devices.)
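For a sense of what keeping everything on-device means in practice, here is a hypothetical sketch of the browser-side wiring: the uploaded X-ray is decoded locally and handed straight to the in-browser model, and the only network request is the one-time download of the model files. The element IDs and the classifyXray() helper (from the sketch above) are illustrative, not Chester’s actual code.

```typescript
// Hypothetical wiring that keeps the X-ray on your device: the file is
// decoded locally and passed straight to the in-browser model. No fetch()
// or form upload ever carries the image; the only network traffic is the
// one-time model download. classifyXray() is the helper from the sketch above.

const fileInput = document.getElementById('xray-file') as HTMLInputElement;
const results = document.getElementById('results') as HTMLElement;

fileInput.addEventListener('change', async () => {
  const file = fileInput.files?.[0];
  if (!file) return;

  // Decode the image entirely in the browser via a temporary object URL.
  const img = new Image();
  img.src = URL.createObjectURL(file);
  await img.decode();
  URL.revokeObjectURL(img.src);

  // Run the local model and show one probability per finding.
  const probs = await classifyXray(img);
  results.textContent = Object.entries(probs)
    .map(([label, p]) => `${label}: ${(p * 100).toFixed(0)}%`)
    .join('\n');
});
```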


One clear benefit of Chester AI is that as a local AI, it’s cheap to operate. “Because it runs in the browser we don’t have to run servers to process each image,” says Cohen. “This enables us to allow everyone to use it for free.” Amazon and Google have some of the biggest server farms in the world to run their complex cloud AIs. Chester AI’s only expense to operate is sending a small bit of code to your phone or computer. The heavy lifting is done with the hardware you’ve already bought.

Of course you would still need to get an X-ray at the doctor’s office. But with this sort of data in hand, will coders trade AI snippets on GitHub, sharing tools to spot cancers, heart problems, and more, so we can self-diagnose with more accuracy than we would trawling WebMD? Cohen isn’t so sure, and he insists that isn’t the intent of the project.

[Screenshot: mlmed.org]

“This tool is for a second diagnosis. So far our interaction with doctors has been that it is useful if they are in a hurry (like in an ER) and want to have someone run this image to confirm what they think or to help them not miss anything,” says Cohen. “For radiologists in training, this will help them to form a consistent understanding no matter who their teacher is.”

Cohen imagines his AI as an assistant to experts, which is just the sort of argument IBM has made for its Watson AI tools. But Cohen actually has no interest in taking his technology to the private sector. He wants it to remain open source and free. In places like Canada, where his research is based, Cohen believes public funds will lead to the creation of more tools like Chester, largely because they would save the healthcare industry money. As these free tools propagate, the healthcare industry would have to adapt. Every big-money healthcare vendor would need to prove its worth against a benchmark of free, publicly audited tools.

And of course, the unsaid benefit is that, even if AI doesn’t outright replace doctors or lower the cost of medical visits, having free tools that can be used anywhere in the world should improve diagnosis and treatment. “We think that this can ensure a minimum level of care across the entire world,” says Cohen. “No radiologist (or even general doctor) can be worse than this tool.”


About the author

Mark Wilson is a senior writer at Fast Company who has written about design, technology, and culture for almost 15 years. His work has appeared at Gizmodo, Kotaku, PopMech, PopSci, Esquire, American Photo, and Lucky Peach.
