Moorfields and DeepMind: bringing AI closer to the eye clinic

Pearse Keane is a consultant ophthalmologist at Moorfields Eye Hospital, where he specialises in the treatment of retinal diseases. He is also an associate professor at the UCL Institute of Ophthalmology, and leads a clinical research group focussed on the development, evaluation and implementation of artificial intelligence (AI) in healthcare. The group's core aims are to put ophthalmology at the forefront of applying AI in healthcare, and to serve as an exemplar for other medical specialties.

In 2015, Pearse initiated a collaboration between Google DeepMind and Moorfields to develop the use of AI in detecting early-stage wet age-related macular degeneration (AMD) from eye scans. The system is still in the testing phase, but early results have proved extremely positive. Pearse and his team hope that by speedily identifying those in need of urgent treatment, this revolutionary diagnostic tool could help save the sight of thousands of people in the UK, and potentially millions more around the world.

Dr Pearse Keane

How did the Google DeepMind-Moorfields collaboration come about?

DeepMind is arguably the world's leading artificial intelligence company. They received a lot of attention in 2014 when they were acquired by Google, and in July 2015, I read a profile of DeepMind in Wired UK. It described some of their work in reinforcement learning – teaching artificial intelligence agents to play video games. What really caught my eye was their ambition, firstly to “solve intelligence”, and secondly to use it to make the world a better place. I'm a bit of an idealist, and I like to think big, so that really resonated with me.

DeepMind was founded in London in 2010 by Mustafa Suleyman, Demis Hassabis and Shane Legg. Demis and Shane are both alumni of UCL (the university affiliated with Moorfields), while Demis and Mustafa were born and brought up in London – so they all have strong links with the UK. They were put under a lot of pressure to relocate the company to Silicon Valley, but they wanted to stay in London; they felt they could attract global talent while based here.

Mustafa Suleyman, then the head of DeepMind Applied AI, was quoted as saying, “Preventative medicine is the area I'm most excited about. There's huge potential for our methods to improve the way we make sense of data.” For me, that was a lightbulb moment, because I'd already been thinking about using AI in ophthalmology, and in particular, deep learning for OCT [optical coherence tomography] scans. I thought, Mustafa is from North London, his mother was an NHS nurse in Barnet: he's going to know Moorfields – and he's senior enough to make things happen. I realised he was the person I needed to contact.

I found his profile on LinkedIn and crafted a very careful message to him, explaining that I'm a consultant at Moorfields, and that I treat retinal diseases like AMD and diabetes, which we diagnose with eye scans called OCTs. We do thousands of them every week at Moorfields, but we're overwhelmed with the number of patients that we need to see. Unfortunately, people are going blind because they cannot be seen and treated quickly enough – not just at Moorfields, but across the NHS, and indeed around the world.

I suggested we work together to apply deep learning to OCT scans, so that we could develop an algorithm that prioritises patients with the most serious sight-threatening disease and gets them in front of someone like me, a retina specialist, as soon as possible. He responded almost immediately, and a few days later I found myself talking to him in Google's offices in King's Cross. The timing was really good, because back then, people weren't yet exploring deep learning in healthcare.

It’s also important to mention that Moorfields Biomedical Research Centre, and its director, Sir Peng Khaw, were supportive from the start. Without that recognition and support, there's no way the collaboration would have worked.

What have been the main achievements of the Moorfields/Google DeepMind collaboration?

When I first met Mustafa Suleyman, he asked me a lot of questions: how many OCT scans do you have in total? How many of them are from AMD patients? What's the file format, and is it proprietary – or can you make it into an open format? He also asked about the ethics and the information governance required to share the data, and then there were the contractual considerations; I didn't have answers to any of those questions, so I had to go away and speak to people at Moorfields, including Sir Peng.

In the years since then, when DeepMind has met clinicians from other specialties and asked similar questions, they have often never heard from them again. The good news is that we were tenacious enough to answer all the questions. Thanks to support from Sir Peng, and across all levels at Moorfields, in 2016 we were able to sign a formal research collaboration agreement and share about a million anonymised historical OCT scans. By August 2018, we had published the first proof-of-concept results in the journal Nature Medicine, in an article that has been cited nearly 500 times – more than any other article of a similar age from this journal.

Analysing an OCT scan of the right eye. Credit: Moorfields Eye Hospital (library shot)

Crucially, we were able to prove that our AI system was as good as a number of world-leading consultant ophthalmologists at assessing and triaging more than 50 different retinal diseases. The algorithm was also able to diagnose and quantify many of the common disease features on the scans. Additionally, it's not a black box: it gives human specialists interpretability by showing why it has come to a diagnostic decision. We think that's the key to doctors and healthcare professionals actually using this technology in the real world.
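
To make the idea of an interpretable triage output a little more concrete, here is a minimal, purely illustrative Python sketch. The feature names, areas and thresholds are invented for the example and are not the published Moorfields/DeepMind model; the point is simply that the system returns a referral recommendation together with the quantified evidence behind it, rather than a bare yes/no answer.

```python
# Illustrative sketch only: hypothetical feature names and thresholds,
# not the published Moorfields/DeepMind model.

# Hypothetical quantified disease features for one OCT scan, e.g. the area
# (in mm^2) of each feature that a first-stage analysis might measure.
scan_features = {
    "subretinal_fluid": 0.8,
    "intraretinal_fluid": 0.0,
    "drusen": 1.2,
}

def triage(features):
    """Map quantified features to a referral urgency (hypothetical rules)."""
    if features.get("subretinal_fluid", 0) > 0.5 or features.get("intraretinal_fluid", 0) > 0.5:
        urgency = "urgent referral"
    elif features.get("drusen", 0) > 1.0:
        urgency = "routine referral"
    else:
        urgency = "observation only"
    # Return the decision *and* the features it was based on, so a specialist
    # can see why the recommendation was made (the interpretability point above).
    return urgency, features

decision, evidence = triage(scan_features)
print(decision, evidence)
```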

What we're working on now is the translation of this algorithm into something that can be used at scale around the world, but we need to be very thoughtful about using AI for direct patient care. Although we’re very enthusiastic and excited about the potential of this technology, we have to balance that with appropriate caution.

Pearse Keane with patient Elaine Manna at the Royal Institution Christmas Lectures in 2019, alongside presenter Hannah Fry of the UCL Centre for Advanced Spatial Analysis and Cían Hughes.

How close is the algorithm to being ready for use in a clinical setting?

Since the initial results were published, we've been doing a global validation study, running the algorithm on datasets from the largest eye centres on every continent. Fortunately, because Moorfields has such global links, I've been able to contact people who trained at Moorfields or in the NHS and now work in Ghana, São Paulo (Brazil), Mexico City, North America and Europe. We plan to publish the results of these tests later this year. This is really important to minimise the potential for bias in AI: we want to prove that our algorithm will work as well on a patient with diabetic macular oedema in Ghana as it will in South America, North America, or at Moorfields.

At the same time, Google are working on the technical maturity of the algorithm. They've rewritten it so that it runs as a cloud-based application, in a fraction of the time and with a fraction of the computing power. That's very important, because if it's going to be used at scale around the world, it can't sit on a supercomputer within DeepMind. We were able to do live demonstrations of this rewritten algorithm at Wired Health at the Crick Institute in March 2019 and at the Royal Institution Christmas Lectures in December 2019.

We are currently working with collaborators in Moorfields – including Konstantinos Balaskas, Dawn Sim and Peter Thomas – to see how the algorithm could slot into rapid-access clinics within the hospital. In the longer term, we are thinking about how Moorfields can extend itself out into the community. Our starting point is working out what would be the best experience for patients, and working backwards from there. For example, if you develop sight loss, you'll probably go to your local optician, where increasingly you'll be offered an OCT scan. Wouldn't it be great if the technical infrastructure existed to link those OCTs with Moorfields, where an AI system could be used to triage them? We're working with Google now to make that happen.
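
As a rough illustration of the kind of workflow described above – an optician's scan arriving, being triaged, and the most urgent referrals being seen first in a rapid-access clinic – here is a minimal Python sketch. The function names, urgency categories and patient identifiers are hypothetical assumptions, not a real Moorfields or Google interface.

```python
import heapq

# Hypothetical urgency categories and their queue priority (lower = seen sooner).
URGENCY_RANK = {"urgent referral": 0, "routine referral": 1, "observation only": 2}

clinic_queue = []  # a priority queue for the rapid-access clinic

def refer_from_optician(patient_id, urgency):
    """An optician's OCT has been triaged (urgency is the AI's recommendation);
    queue the referral so that the most urgent cases are seen first."""
    heapq.heappush(clinic_queue, (URGENCY_RANK[urgency], patient_id, urgency))

def next_patient():
    """The rapid-access clinic pulls the most urgent waiting referral."""
    _, patient_id, urgency = heapq.heappop(clinic_queue)
    return patient_id, urgency

refer_from_optician("patient-001", "routine referral")
refer_from_optician("patient-002", "urgent referral")
print(next_patient())  # ('patient-002', 'urgent referral') is seen first
```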

We're also looking at whether we can work with NHS organisations such as the Accelerated Access Collaborative (AAC) to see how the algorithm could be used, firstly around the UK, then around the world. That's the reason we're collaborating with Google – because we want to see it used in a million people. This is going to lead to huge patient benefits, but also, Moorfields will be one of the few places in the world that will have experience of AI in healthcare. We're showing how you go from an idea to an algorithm, and from an algorithm to an application. If we can do that once, we can do it many times, across ophthalmology – and then we can share that experience across healthcare.

Have there been any further developments since Moorfields and Google DeepMind developed the initial algorithm?

We've added another big piece to the AMD puzzle with the latest research from the Moorfields-DeepMind collaboration, recently published in Nature Medicine with Reena Chopra (a research optometrist at Moorfields). I'm the joint senior author on this paper and Reena is the joint first author.

Many people who receive injections for wet AMD in one eye have dry AMD in the other eye; we've discovered that we can use AI to predict when, or if, they might develop wet AMD in their 'good' eye, six months ahead of time. So far, we've shown proof of concept, but we're working on how we can get to a clinical trial using some sort of prophylactic treatment for those patients, which would ultimately prevent the conversion from dry to wet AMD – but it's another multi-year project.

AMD is a massive disease: 25% of those over the age of 60 in Europe have early or intermediate AMD, and it’s the commonest cause of blindness in the developed world – so this algorithm is going to be very important. We can use it to predict, with a high degree of certainty, some of the patients who are going to convert to wet AMD, but not all of them. The results are good, but not spectacular, so we're now working to refine it. We haven’t solved AMD using AI, but we think it's going to be a big step forward.

Close-up of an OCT machine in use by a technician performing a scan. Credit: Moorfields Eye Hospital (library shot)
A patient at Moorfields undergoing an OCT eye scan. Credit: Moorfields Eye Hospital (library shot)

Can you tell us more about the relationship between Google DeepMind, Moorfields and the UCL Institute of Ophthalmology?

Working with DeepMind, and now Google Health, has been amazing. Their collaboration with Moorfields is of central importance to their work in ophthalmology, which I believe will be the first medical speciality to be fundamentally transformed by AI. They are extremely smart and knowledgeable in their domain, which is computer science and engineering, and I'm a massive fan of their work, such as their breakthrough in protein folding that featured in Nature in January. However, what I've come to learn is that we have some pretty smart people too; we can hold our own very strongly, in terms of the clinical expertise we can provide at Moorfields and the scientific expertise from the UCL Institute of Ophthalmology. It's very powerful when you have a seamless synergy between the clinical, the scientific and the technical. Moorfields and the BRC have been very good at speaking the language that allows a good to-and-fro between the two organisations.

I've also been struck by how idealistic Google are: they want to do things right. I've learned a lot from them about things like data protection and information security. And although I think the public are right to be cautious about big AI companies like DeepMind getting access to data, I've never witnessed anything that felt like them taking a shortcut, or not being as transparent as they can be.

Have you been involved in any public and patient involvement and engagement around the collaboration, and how have you communicated with patients about the use of their data?

Although one of the central pledges of the NHS constitution is to use the data that's collected for the benefit of patients and the NHS, we were very aware from the start of the sensitivities around sharing NHS data externally, with industry partners. That’s why we decided from an early stage to be as transparent and as careful as possible. 

This project involves the retrospective use of historical, anonymised OCT scans at Moorfields, so before we did any research, we wanted to tell people what we were planning to do. In 2016, we published the protocol for our research in an open-access, peer-reviewed journal called F1000. Before we signed the research collaboration agreement, we made sure we had the support not only of major eye disease charities (the Macular Society, RNIB and Fight for Sight), but also of the Royal College of Ophthalmologists and, most importantly of all, of patients.

We've made sure that information about the collaboration is available on the Moorfields website, including how to opt out*, which is something we take very seriously. In the last five years, we've had only around ten people opt out of sharing their data with Google DeepMind, although none of those people actually had OCT scans done at Moorfields. We've also made sure we incorporate all national patient opt-outs.

We've put a lot of safeguards in place in terms of data protection. Not only do we anonymise our data to the standards specified by the Information Commissioner's Office in 2012, but we go above and beyond that to further reduce the risk of re-identification. For example, if we only have five patients in the data with a rare disease, we don't share them, because that's a risk for re-identification. We also have contractual safeguards in place, so Google are not allowed to link the data with other datasets or attempt to re-identify it. And we minimise the data we share, so that it's only for these specific research projects. But there's always room for improvement – and we're always open to constructive criticism.
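
To illustrate the kind of small-count safeguard described above – not sharing data for conditions where only a handful of patients could make re-identification possible – here is a minimal Python sketch. The records, field names and the threshold of five are illustrative assumptions, not the actual Moorfields data pipeline.

```python
from collections import Counter

# Hypothetical records of (patient_id, diagnosis); field names are illustrative.
records = [
    ("p1", "AMD"), ("p2", "AMD"), ("p3", "AMD"), ("p4", "AMD"), ("p5", "AMD"),
    ("p6", "diabetic macular oedema"), ("p7", "diabetic macular oedema"),
    ("p8", "very rare dystrophy"),  # a single patient: sharing this risks re-identification
]

MIN_GROUP_SIZE = 5  # the kind of threshold described in the interview

def filter_for_sharing(records, min_group_size=MIN_GROUP_SIZE):
    """Exclude records whose diagnosis group is too small to share safely."""
    counts = Counter(diagnosis for _, diagnosis in records)
    return [r for r in records if counts[r[1]] >= min_group_size]

shared = filter_for_sharing(records)
print(shared)  # only the five AMD records pass this illustrative threshold
```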

*On this page https://www.moorfields.nhs.uk/faq/deepmind-health-qa, see the section 'Do patients have to give their consent for their data to be used?' 
Pearse Keane with Moorfields patient Elaine Manna. Credit: Moorfields Eye Hospital (library shot)

What’s been the single biggest lesson you’ve learned since you started working with Google DeepMind?

That you should think big. The endpoint of what we do as researchers should not be publishing a paper in a journal like Nature Medicine; it's a career-defining thing, but too much research just stops there. What we need to do is get this new tool used on a million people. And if it's used on a million people, I'd like to see it used on a billion people, or a billion times.

The other thing I’ve learned is that the people who make an impact are not necessarily different from any of us. They’re people who have a good idea, and they go with it, and they make stuff happen. These are not super-geniuses who are a different kind of creature to us. The people at DeepMind are amazing, but the people at Moorfields and the UCL Institute of Ophthalmology are pretty amazing too.