Analysing X-rays for hours at a time is tough for radiologists and can lead to human error as headaches and eye strain set in. New research shows that artificial intelligence can monitor radiologists’ gaze patterns and tell them when another professional should re-examine the X-rays.
When radiologists determine whether a person has lung cancer or pneumonia, they analyse X-ray images.
Experts can identify irregularities in an X-ray image and make a diagnosis based on this.
However, even experts can develop headaches and eye strain and make mistakes after many hours with their eyes glued to a screen. In the worst case, such errors mean that radiologists fail to detect a disease, leaving the patient untreated.
In fact, radiologists failing to see abnormalities accounts for 60–80% of errors in imaging, and the number of errors increases towards the end of a long workday.
New research shows that artificial intelligence can support radiologists in their work by detecting when their eyes no longer register all the details in an image, and when an extra pair of eyes should therefore assess the X-rays.
“The aim is not to replace radiologists with artificial intelligence or identify their errors. The focus is how to use artificial intelligence to support healthcare professionals in a way that can be implemented rapidly and without risk to patients,” explains a researcher involved in the project, Bulat Ibragimov, Associate Professor, Image Analysis, Computational Modelling and Geometry, Department of Computer Science, University of Copenhagen.
The research has been published in IEEE Journal of Biomedical and Health Informatics.
Radiologists gradually examine less of an image
The researchers used eye-tracking cameras and artificial intelligence to monitor the gaze patterns of four radiologists while they examined 400 X-ray images of patients’ lungs.
The purpose was to determine whether artificial intelligence could detect when the radiologists tire of reading images and no longer examine all parts of an image equally thoroughly.
The results showed that the radiologists gradually examined less and less of the images, and thus less of the patients’ lungs, which may have major clinical consequences.
Bulat Ibragimov says that previous studies have shown that radiologists make 30% more mistakes towards the end of a long shift. One reason is probably that they no longer examine the images of the patients’ lungs as thoroughly as they should.
“If you are full of energy and examine the entire image of the lung, you are less likely to make errors. But getting tired and examining less and less of the images increases the risk of error. In addition, other things, such as personal problems or other distractions, also affect how well radiologists examine the X-ray images. The artificial intelligence solution we developed enables us to capture the moments when rest is needed,” he adds.
Artificial intelligence should not replace radiologists but should support them
Bulat Ibragimov says that analysing radiologists’ gaze patterns and revealing whether they examine the entire X-ray image thoroughly is a novel way of assisting the radiological workflow while minimising risk.
The artificial intelligence algorithm does not make the diagnosis itself, so there is no risk of it making a diagnostic error. Instead, the aim is to develop a warning system that automatically tells radiologists when an image has not been examined thoroughly enough, so that the computer can request an additional examination of the image by another radiologist.
“We need to detect when an image is not examined properly, since this can harm the patient. Introducing an extra examination in these cases will eliminate many of the human errors that especially occur towards the end of a long shift,” says Bulat Ibragimov.
No need for 100% accuracy
Bulat Ibragimov thinks that a system to track radiologists’ gaze patterns can be implemented rapidly.
The system does not have to replace any other system, and since it does not interfere with image reading and diagnostic decision-making, it does not require the long approval procedure that would be needed if artificial intelligence were used to analyse the images instead of a radiologist. In that case, researchers would have to prove that the system is close to 100% accurate.
This does not apply to the system developed by Bulat Ibragimov.
If the system misses something that the radiologist has also missed, it does not change the outcome.
Conversely, the system can help radiologists make more correct diagnoses and fewer errors.
“This is a new way of thinking about using artificial intelligence in healthcare. The potential of tracking radiologists’ eyes and analysing the results with artificial intelligence is not limited to predicting fatigue. We are now considering using it for radiological training. We can let the system track the gaze patterns of an expert radiologist first and then those of the students. Thus, we can determine when the students’ gaze patterns start to resemble those of an expert, and when they are therefore ready to handle the X-rays of real patients,” explains Bulat Ibragimov.
He adds that the researchers will carry out further studies, investigating not only how radiologists examine images less thoroughly over time but also what they look at and what they spend less time examining.