A group of scientists from Stanford University has developed a neural network that diagnoses pneumonia from chest X-ray images, and it does so no worse than practicing radiologists.
To create the algorithm, the researchers built a 121-layer neural network and trained it on 112,120 frontal chest X-rays obtained from 30,805 patients. Each image was labeled according to the lung diseases present. During training, the images were scanned, digitized, downscaled to 224 × 224 pixels and fed to the neural network. A random 80% of the images were then drawn from the full database and used to tune the algorithm; the remaining 20% were held back for testing and debugging the system.
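The random 80/20 split described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' actual pipeline; the file names and the fixed seed are hypothetical, and the resizing to 224 × 224 mentioned in the article is noted only in a comment to keep the sketch dependency-free.

```python
import random

def train_test_split(items, train_fraction=0.8, seed=42):
    """Shuffle a list of image identifiers and split it 80/20.

    In the full pipeline each image would also be downscaled to
    224 x 224 pixels before being fed to the network (e.g. with an
    image library); that step is omitted here.
    """
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    shuffled = items[:]        # copy so the caller's list is untouched
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_fraction)
    return shuffled[:cut], shuffled[cut:]

# 112,120 images in the dataset, per the article; names are invented.
image_ids = [f"xray_{i:06d}.png" for i in range(112120)]
train_ids, test_ids = train_test_split(image_ids)
print(len(train_ids), len(test_ids))  # 89696 22424
```

With 112,120 images, an 80% cut yields exactly 89,696 training images and 22,424 test images.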
In the next stage of testing, the Stanford researchers took 420 new images, on which practicing radiologists gave their conclusions. The doctors had substantial experience, ranging from 4 to 28 years. Neither the radiologists nor the neural network had access to the patients' medical records; only the images were available. It turned out that the neural network was hardly inferior to specialists with many years of experience.
At the same time, the researchers note that only frontal images were used in the test, while in medical practice lateral views and the patient's history are also studied, which affects the diagnosis. Nevertheless, similar programs could help diagnose diseases in remote regions that lack qualified personnel.