How can we teach a computer to diagnose skin conditions?


Raluca Jalaboi and Mads Eiler Hansen are Machine Learning Engineers working with AI-assisted diagnosis in LEO Innovation Lab
With more than 2,000 known skin diseases, training AI to discover patterns can improve diagnostic accuracy and patient care. Here’s how it works.

Artificial intelligence is no longer a thing of the future but an increasingly integrated part of our daily lives. From filtering out spam in your inbox, to ensuring a safe flight when you long for warmer destinations, its uses cover a variety of industries – including healthcare.

Humans are generating more data than ever before. Each time we use services from huge data companies like Google, Facebook or Amazon, we’re powering a global shift, one in which our data makes the decisions about what we do, see and desire.

The question therefore arises: How is this global technology shift affecting healthcare, and what changes should we expect to see as a result?

 

Using artificial intelligence to augment healthcare

In LEO Innovation Lab, our own exploration into AI-powered models has allowed us to identify psoriasis with an accuracy of 91%. The technology we use is called deep learning – a subfield within artificial intelligence whereby models are trained to classify vast amounts of unstructured data, such as images.
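To make that 91% figure concrete: accuracy is simply the fraction of images the model labels correctly when compared against expert labels. A minimal sketch, using hypothetical predictions rather than our actual evaluation data:

```python
# Accuracy is the fraction of predictions that match the expert label.
def accuracy(predictions, truths):
    correct = sum(p == t for p, t in zip(predictions, truths))
    return correct / len(truths)

# Hypothetical model outputs versus dermatologist labels:
preds  = ["psoriasis", "psoriasis", "normal", "psoriasis"]
truths = ["psoriasis", "normal",    "normal", "psoriasis"]
print(accuracy(preds, truths))  # 3 of 4 correct: 0.75
```

A single accuracy number can hide imbalances between classes, which is one reason the quality and diversity of the underlying data matters as much as the model itself.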

We’re using these models to help patients better understand their skin condition, and to develop AI-assisted tools that can support doctors’ decision making when diagnosing and treating skin conditions.

Our model is powered by imaging data from our skin tracking app Imagine, where people upload photos of their skin to monitor their condition over time.

So far, our technology has been focused on training the algorithm to successfully diagnose psoriasis. Our next phase of work will see us branch out into other chronic skin diseases such as eczema and rosacea.

 

Teaching the machine

The software we’re building for Imagine is a neural network. Its structure is loosely inspired by the human brain, and we’re training it by showing it thousands of images – all completely anonymised and inaccessible to anyone else.
By constantly adjusting its internal parameters, the network starts to recognise tiny changes in the skin. Eventually, it learns to diagnose skin conditions. The more quality data it is given, the better it learns that a certain combination of patterns, such as the colour and thickness of the skin, should prompt one response over another.

What you effectively end up with is a mathematical structure that can detect patterns and categorise them accordingly. The model doesn’t really know what psoriasis is, per se; rather, it understands which combination of patterns matches something it has learnt to label as psoriasis.
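To make that idea tangible, here is a deliberately tiny sketch: a single artificial “neuron” that learns from labelled examples that a combination of two pattern scores should prompt one label over another. The features (redness and plaque thickness, scored 0 to 1), the data and the labels are all invented for illustration; the real model operates directly on image pixels with millions of parameters.

```python
import math

# Toy illustration (not the actual Imagine model): one "neuron" learns to map
# two skin features, redness and plaque thickness, to a label
# (1 = "psoriasis-like", 0 = "normal").

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, epochs=2000, lr=0.5):
    """samples: list of ((redness, thickness), label) pairs."""
    w1, w2, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x1, x2), y in samples:
            p = sigmoid(w1 * x1 + w2 * x2 + b)
            err = p - y                # gradient of the log-loss wrt the logit
            w1 -= lr * err * x1        # nudge each weight to reduce the error
            w2 -= lr * err * x2
            b  -= lr * err
    return w1, w2, b

def predict(params, x1, x2):
    w1, w2, b = params
    return sigmoid(w1 * x1 + w2 * x2 + b)

# Hypothetical labelled data: high redness plus thick plaques -> label 1.
data = [((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.7, 0.7), 1),
        ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.3, 0.2), 0)]

params = train(data)
print(predict(params, 0.85, 0.8))  # high score: "psoriasis-like"
print(predict(params, 0.15, 0.1))  # low score: "normal"
```

The trained weights are exactly the “mathematical structure” described above: the model never knows what psoriasis is, it only knows which weighted combination of inputs it has learnt to associate with that label.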

The skin tracking app Imagine employs computer vision and machine learning technology to empower anyone with a smartphone to benefit from instant feedback on their skin issue

 

Quality data for quality healthcare

The first step of creating this AI technology is to source a high volume of quality data – that is, large amounts of diverse, labelled data.

So far, we have over 70,000 labelled images from users of Imagine, each of which has been ‘diagnosed’ (labelled) by several expert dermatologists and used to train our deep learning model.
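Having several dermatologists label each image implies some way of consolidating their possibly conflicting opinions into one training label. A simple, commonly used scheme is a strict majority vote; the function below is a hypothetical sketch of that idea, not our actual labelling pipeline.

```python
from collections import Counter

def consolidate(labels):
    """Return the strict-majority label among a list of expert labels,
    or None when there is no clear majority (such images might be sent
    for another review round instead of being used for training)."""
    counts = Counter(labels)
    top, n = counts.most_common(1)[0]
    if n * 2 > len(labels):        # strict majority required
        return top
    return None

print(consolidate(["psoriasis", "psoriasis", "eczema"]))  # psoriasis
print(consolidate(["psoriasis", "eczema"]))               # None: no majority
```

Requiring a strict majority is one way to keep ambiguous images out of the training set, since a model learns whatever its labels teach it, right or wrong.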

To date, most of the images we have are of either psoriasis or normal skin. In order to further develop our model, we now need more data on other skin diseases and skin types. Psoriasis on dark skin, for example, looks different to psoriasis on pale skin, and we need to make sure that the model develops to be an inclusive technology.

 

Explaining the decisions of the AI

Explainability is a widely debated subject within the field of artificial intelligence. As AI algorithms become more complex, it becomes harder and less transparent for us to understand how they arrive at one conclusion over another.

To address this issue, we’re working on a way to force the model not only to recognise the disease from an image of the skin, but also to identify the characteristics upon which it bases its conclusion, such as discolouration or skin texture.

If you want to understand what goes on in the model, you need to force it to explain its decisions. But at the same time, we can’t expect to understand everything the AI does. We trust an airplane when flying, although no single person on that plane is able to explain every single step of how it leaves the ground.
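One simple, model-agnostic way to “force” a model to expose what it relied on is occlusion sensitivity: cover one part of the image at a time and measure how much the model’s confidence drops. The parts whose removal hurts the score most are the ones the model based its decision on. The sketch below uses a toy 3×3 “image” and a toy scoring function; this is one standard technique for illustration, not necessarily the method used in Imagine.

```python
def occlusion_map(image, score_fn, baseline=0.0):
    """image: 2D list of pixel values; score_fn: image -> confidence score.
    Returns a same-shaped map of how much the score drops when each
    pixel is occluded (replaced by the baseline value)."""
    base = score_fn(image)
    rows, cols = len(image), len(image[0])
    drops = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            saved = image[r][c]
            image[r][c] = baseline            # occlude one pixel
            drops[r][c] = base - score_fn(image)
            image[r][c] = saved               # restore it
    return drops

# Hypothetical "model": scores an image by the brightness of its centre
# pixel, so occluding the centre should produce the largest drop.
def toy_score(img):
    return img[1][1]

img = [[0.1, 0.1, 0.1],
       [0.1, 0.9, 0.1],
       [0.1, 0.1, 0.1]]
drops = occlusion_map(img, toy_score)
print(drops[1][1])  # large drop: the model relied on this region
print(drops[0][0])  # no drop: this region did not matter
```

On a real dermatology model, the resulting map could be overlaid on the photo to highlight which skin regions drove the diagnosis, precisely the kind of characteristic-level explanation described above.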

Of course, left unchecked, AI has the potential to grow beyond human control. But if we look at the merits of trusting some of the ‘inexplicability’ of AI decisions, we see that we’re capable of doing things that weren’t possible before. And it’s our role as engineers to ensure quality assurance.

 

How might AI be used in dermatology?

At a fundamental level, this technology allows us to make healthcare more accessible.

With AI in imaging, patients can get an immediate, deep understanding of what’s going on with their skin, as well as the actions to take. Where reaching an accurate diagnosis might previously have taken months or years of trial and error, we can reasonably expect it to happen within a few minutes with AI-powered solutions.

The idea that this tool (and others like it) could provide this kind of care has game-changing potential for healthcare. However, the role of AI in dermatology is not to replace the doctor, but rather to eliminate doubt from diagnostics by enhancing doctors’ expertise, and to free up their time for other aspects of patient care.

For example, we know from interviews with dermatologists that one reason they sometimes extend consultations is to ensure that the patient understands the type of disease they have, as well as its implications. We need humans for that. What the models we develop can do is automate the tasks leading up to this point, thereby giving more time and space for this much-needed human interaction.

 

The sky’s the limit

This is an explorative time in technology, and particularly within healthcare. The rate of change is happening fast, and it’s our responsibility as innovators, designers and engineers to figure out how we can implement these changes to improve the course of patient care.

With the right interventions, technology can bring accessible, affordable healthcare to people around the globe – irrespective of their geography or income. When looking into how we can invent new things, it’s not a question of whether this will push humanity forward, but how far.

By Machine Learning Engineers Mads Eiler Hansen & Raluca Jalaboi

 

Deep Learning

Deep learning is a subfield of artificial intelligence that seeks to teach computers to automate tasks normally performed by humans. The computer model learns to perform classifications from a range of data inputs such as images.

Deep learning models are trained by being fed input from large sets of labelled data, that is, data that has been labelled with certain characteristics. Over time – through pattern recognition – the models learn to classify the unstructured data, such as a photo of a rash on your skin.

 
