OP-ETC-MU-01 Chest X-ray Image Interpretation Using a Deep Learning-Based Approach
Abstract
Background: Chest x-ray image interpretation by radiological technologists (RTs) is an important procedure for producing high-quality images. This time-consuming task typically requires expert RTs to interpret the images. Recently, deep convolutional neural network (D-CNN) learning approaches have achieved expert-level performance in medical image interpretation tasks. We aimed to investigate an automatic image interpretation system to assist less experienced RTs in decision making.
Methods: In this study, 2,277 chest x-ray images in total were retrospectively collected from Siriraj Hospital. The dataset was split into training (70.3%), validation (17.5%), and independent testing (12.2%) sets. Four D-CNN architectures (AlexNet, ResNet-18, GoogLeNet, and InceptionV3) were used to classify the images as pass or reject. Classification performance was assessed using the area under the receiver operating characteristic curve (AUROC), validation accuracy, testing accuracy, recall, and precision.
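The abstract does not specify the training framework, so the following is only a minimal sketch of pass/reject fine-tuning with a pretrained InceptionV3, assuming PyTorch/torchvision; the directory layout (cxr/train, cxr/val with pass/reject subfolders) and variable names are hypothetical, and only the epoch count, mini-batch size, and learning rate reported in the Results below are taken from the study.

```python
# Minimal sketch of binary pass/reject fine-tuning with InceptionV3.
# Framework (PyTorch/torchvision), folder layout, and variable names are
# assumptions; only epochs=30, batch_size=32, lr=1e-4 come from the abstract.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# InceptionV3 expects 299x299 RGB inputs; grayscale x-rays are replicated to 3 channels.
tf = transforms.Compose([
    transforms.Grayscale(num_output_channels=3),
    transforms.Resize((299, 299)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

# Hypothetical folder layout: cxr/train/{pass,reject}, cxr/val/{pass,reject}.
train_ds = datasets.ImageFolder("cxr/train", transform=tf)
val_ds = datasets.ImageFolder("cxr/val", transform=tf)
train_dl = DataLoader(train_ds, batch_size=32, shuffle=True)
val_dl = DataLoader(val_ds, batch_size=32)

# Replace the final (and auxiliary) classifier layers for 2-class output.
model = models.inception_v3(weights="IMAGENET1K_V1", aux_logits=True)
model.fc = nn.Linear(model.fc.in_features, 2)
model.AuxLogits.fc = nn.Linear(model.AuxLogits.fc.in_features, 2)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(30):
    model.train()
    for x, y in train_dl:
        x, y = x.to(device), y.to(device)
        optimizer.zero_grad()
        out, aux = model(x)                      # training mode returns main + auxiliary logits
        loss = criterion(out, y) + 0.4 * criterion(aux, y)
        loss.backward()
        optimizer.step()

    # Track validation accuracy per epoch.
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in val_dl:
            x, y = x.to(device), y.to(device)
            pred = model(x).argmax(dim=1)        # eval mode returns only main logits
            correct += (pred == y).sum().item()
            total += y.numel()
    print(f"epoch {epoch + 1}: val acc = {correct / total:.3f}")
```

The same loop applies to the other three architectures by swapping the backbone and its classifier head; only InceptionV3 needs the 299x299 input size and auxiliary-logit handling.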
Results: The overall AUROC of the four D-CNN architectures was 0.966 (range 0.739-0.999). The best-performing model was InceptionV3 trained for 30 epochs with a mini-batch size of 32 and a learning rate of 0.0001. Its validation accuracy, independent testing accuracy, recall, and precision were 92.9%, 98.6%, 99.0%, and 98.6%, respectively.
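For reference, the reported metrics can be computed from test-set predictions as sketched below, assuming scikit-learn; the labels and scores shown are illustrative placeholders, not the study's data, and "pass" is assumed to be the positive class.

```python
# Minimal sketch of the evaluation metrics on an independent test set.
# Assumes scikit-learn; y_true/y_score values are hypothetical examples.
import numpy as np
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, roc_auc_score)

y_true = np.array([1, 1, 0, 1, 0, 1])                      # 1 = pass, 0 = reject (illustrative)
y_score = np.array([0.97, 0.88, 0.12, 0.95, 0.40, 0.91])   # model's predicted pass probability
y_pred = (y_score >= 0.5).astype(int)                      # hard labels at a 0.5 threshold

print("AUROC    :", roc_auc_score(y_true, y_score))        # uses scores, not thresholded labels
print("Accuracy :", accuracy_score(y_true, y_pred))
print("Recall   :", recall_score(y_true, y_pred))           # sensitivity for the pass class
print("Precision:", precision_score(y_true, y_pred))        # positive predictive value
```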
Conclusion: D-CNNs can be used for chest x-ray image interpretation with very high classification performance.