
Beautiful mind: what AI means for the future of optometry

OT speaks with research optometrist Reena Chopra and ophthalmologist Pearse Keane about a pioneering collaboration with Google’s DeepMind Health and the revolutionary potential of artificial intelligence for eye health


After countless hours spent poring over optical coherence tomography (OCT) scans, a watershed moment occurred for Reena Chopra.

The Moorfields Eye Hospital research optometrist watched as an algorithm developed through a collaboration between clinicians at the London hospital and computer scientists at Google’s DeepMind began to automatically segment different layers of the eye.

“It takes hours for a clinician to segment an OCT scan. It was really impressive that this machine could do it in just a few minutes,” Ms Chopra highlighted.

“When we could see the model learning and improving, we thought, ‘OK, we can do this. It will work’,” she added.

A paper published on the technology in Nature Medicine revealed that the algorithm could assess OCT scans for more than 50 retinal diseases, diagnosing conditions with a level of accuracy on par with world-leading retinal specialists.

If AI could predict future eye disease then clinicians could potentially prevent sight loss before it occurs

Research optometrist Reena Chopra

Once the model is fully refined, it has the potential to prevent avoidable sight loss in millions of patients internationally.

“We developed an artificial intelligence (AI) system that could revolutionise the way that eye diseases are diagnosed,” Ms Chopra highlighted.

“It could play an assistive role in helping clinicians to triage the patient, ensuring that patients with sight-threatening disease are seen as early as possible,” she said.

The intention is that the technology would initially be applied within the NHS’s rapid access macular clinics.

Ms Chopra was responsible for leading a group of optometrists and ophthalmologists in labelling the data that was fed to the algorithm.

“We had to meticulously delineate different features in hundreds of OCT scans – we spent a lot of time making sure that they were pixel perfect,” she shared.

High-quality data is key when it comes to AI, Ms Chopra emphasised.

“It’s like a baby learning new things. It will only learn from the data you give it. If you give it incorrect data, it will not learn anything,” she explained.
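How exacting those labels need to be is easiest to see in the way segmentation quality is usually scored. The following is a minimal, purely illustrative sketch in Python — not the Moorfields and DeepMind code, which the article does not describe — comparing a hypothetical clinician-drawn mask for one retinal layer with a model’s prediction using a Dice overlap score; even a one-pixel-wide labelling slip shifts both what the model learns from and the score it is judged against.

```python
# Illustrative sketch only: not the Moorfields/DeepMind pipeline.
# It scores pixel-wise agreement between a clinician's segmentation mask
# and a model's prediction with a Dice coefficient, which is one reason
# "pixel perfect" labels matter: labelling errors feed directly into both
# training and evaluation.
import numpy as np

def dice_score(reference: np.ndarray, prediction: np.ndarray) -> float:
    """Dice overlap between two binary masks (1.0 = perfect agreement)."""
    reference = reference.astype(bool)
    prediction = prediction.astype(bool)
    intersection = np.logical_and(reference, prediction).sum()
    total = reference.sum() + prediction.sum()
    return 1.0 if total == 0 else 2.0 * intersection / total

# Toy 8x8 "OCT slice" with hypothetical data: one retinal layer spans rows 3-4.
clinician_mask = np.zeros((8, 8), dtype=int)
clinician_mask[3:5, :] = 1          # clinician labels the full layer

model_mask = np.zeros((8, 8), dtype=int)
model_mask[3:5, 1:] = 1             # model misses one column of pixels

print(f"Dice overlap: {dice_score(clinician_mask, model_mask):.3f}")
```

Run as written, the sketch prints a Dice overlap of roughly 0.93 for a prediction that misses just a single column of pixels in this toy example.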

Research optometrist Reena Chopra

Big data

Because of the data-hungry nature of AI, applying the technology to OCT scans is in many ways a natural fit.

Around 1000 scans were taken each day across Moorfields Eye Hospital’s sites in 2017.

“We have accumulated millions of diagnostic images now. With AI we can explore this big data to understand the evolution of disease or to predict if someone is at risk of developing eye disease in the future,” Ms Chopra shared.

“If AI could predict future eye disease then clinicians could potentially prevent sight loss before it occurs,” she added.

Alongside the collaboration between DeepMind and Moorfields Eye Hospital that applies AI to triaging eye disease, Ms Chopra is also leading on early scientific work that aims to predict the conversion from dry to wet age-related macular degeneration.

Separately to Ms Chopra’s work, a partnership between Moorfields Eye Hospital and Dem Dx will see a group of paediatric nurses use a platform that aids clinical decision making to triage conditions including different types of conjunctivitis, cellulitis and acute onset squints in children.

More than one million anonymised historical clinical records will be used to develop machine learning that will optimise the accuracy and relevance of the advice provided by the platform.

Trainee ophthalmologist Siegfried Wagner is working on a project that uses AI to examine retinal photographs and OCT scans for signs of Alzheimer’s disease.

It is as if we have produced this concept car that can break all the world records. The challenge is how do we take that concept car and make it into a vehicle that can be used by people all around the world

Ophthalmologist Pearse Keane

Patients at the heart

Ms Chopra highlighted the importance of patient involvement in the development of projects that use AI.

“The Moorfields and DeepMind collaboration has a really good level of public engagement,” she shared.

“These projects take a lot of time and work so we should make sure that patients’ best interests are at heart,” Ms Chopra said.

As well as working on her PhD part-time, Ms Chopra spends two days a week working as an optometrist in paediatric, contact lens, refraction and injection clinics.

“You see how early treatments can make such a difference. Being a clinician helps me to consider how we should be applying AI,” she added.

Leading the way on AI

The pioneering collaboration with DeepMind began with an email sent by ophthalmologist Pearse Keane in July 2015.

He wanted to see if the AI expertise housed in a company only two tube stops away could be applied to the more than 1000 daily OCT scans taken across Moorfields Eye Hospital’s UK sites.

Dr Keane shared his belief that eye care will be the first specialty within medicine to be “fundamentally transformed” by AI.

He observed that people outside of ophthalmology are increasingly recognising the specialty as leading the way in its use of the technology.

“I think that is for a number of reasons. One of the main reasons is that the eye is this discrete unit. We can visualise and image nearly every part of the eye at a very high resolution in a non-invasive fashion,” Dr Keane said.

Since the Nature Medicine paper was published last year, work has been undertaken to translate the proof-of-concept idea into something that can be used in practice.

A clinical validation process is being undertaken to make sure that the model works consistently in different regions and patient groups.

There are also issues of regulatory approval and making sure that the technology is technically mature.

“It is as if we have produced this concept car that can break all the world records. The challenge is how do we take that concept car and make it into a vehicle that can be used by people all around the world,” Dr Keane said.

Ophthalmologist Pearse Keane

A refined model

In March this year, researchers were able to showcase their work refining the technology with a live demonstration of the algorithm at the WIRED Health event in London.

Dr Keane shared that the algorithm now performs as well as the original model with less computing power and completes the analysis in a fraction of the time.

“I think it is probably a year or two away from anything that would be commercially available but it is certainly not five years away,” Dr Keane emphasised.

Dr Keane predicts that the development of AI will “completely change” the way that diabetic retinopathy screening is performed.

Slightly left-field projects that AI is being applied to include work by the Google Brain team to judge whether a patient is a man or a woman on the basis of retinal photographs.

“Now, you may say, what is the point of that?” Dr Keane highlighted.

“There are easier ways to tell if someone is a man or a woman. I think the reason people have been stunned by these results is that I am a retinal specialist who has been looking at retinal pictures for 15 years in obsessive detail and I cannot begin to tell you whether a patient is a man or a woman by looking at a scan,” he shared.

The same group is also investigating the use of AI on retinal photographs to predict a patient’s age within three years and their refractive error to within half a dioptre.

“You can imagine a future for optometry where people are coming in for an eye check and the range of information that they get about their health becomes an order of magnitude bigger,” Dr Keane said.

I can’t see your smartwatch or your smartphone telling you that you have got a terminal illness or telling you that you are going to go blind

Ophthalmologist Pearse Keane

Machine vs maker

Asked whether there are any human qualities that AI will be unable to replicate, Dr Keane said history has taught us to be cautious of making bold claims about the technology.

“People in the 1970s said that AI would never be able to drive a car but we are nearly there if not at the destination with self-driving cars,” he observed.

“But on the other hand, I do think there are qualities that are innately human which AI will not be able to replicate any time soon,” Dr Keane emphasised.

Among the attributes that are challenging for AI to reproduce, Dr Keane lists empathy, communication skills and creativity.

He sees AI as playing an assistive role rather than threatening to replace clinicians.

“I can’t see your smartwatch or your smartphone telling you that you have got a terminal illness or telling you that you are going to go blind,” Dr Keane observed.

“I feel like the more that I interact with the health system personally and with family members, the more I realise that receiving good health care is about more than a diagnosis,” he said.

Like Ms Chopra, Dr Keane has milestones that stand out in his mind as capturing the awe-inspiring potential of AI.

One occurred at the beginning of this year, three and a half years after his initial exploratory email to DeepMind, when he sat down to have an OCT scan and was presented with AI analysis of his eye health within minutes.

“That was a spine-tingling moment. We now have something that is not just a research paper. It felt like a major step towards something that will be used in billions of people all around the world,” Dr Keane concluded.

Image credit: Jon Enoch
