A Research Paper: Machine Learning Based Analysis of Radiographic Brain Images for Medical Data Mining and Diagnostic Decision Support
Purpose: The first goal of this research is to compare the effectiveness of Deep Learning models against traditional Machine Learning models for image recognition on neuroimaging data. The second is to detail how the accuracy of a Convolutional Neural Network can be improved through an optimization method in which all non-self-adjusting network parameters (hyperparameters) are varied to yield an optimized model.
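The optimization method described above, varying the non-self-adjusting parameters to find the best-performing configuration, can be sketched as an exhaustive grid search. The search space and the `evaluate()` scoring function below are illustrative assumptions, not the paper's actual configuration:

```python
from itertools import product

# Hypothetical search space of non-self-adjusting (hyper)parameters.
search_space = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "batch_size": [16, 32],
    "dropout": [0.25, 0.5],
}

def evaluate(params):
    # Stand-in for training the CNN and returning validation accuracy.
    # This toy score peaks at lr=1e-3, batch_size=32, dropout=0.25.
    score = 0.0
    score += {1e-2: 0.1, 1e-3: 0.3, 1e-4: 0.2}[params["learning_rate"]]
    score += {16: 0.1, 32: 0.2}[params["batch_size"]]
    score += {0.25: 0.2, 0.5: 0.1}[params["dropout"]]
    return score

# Try every combination and keep the best-scoring one.
best_params, best_score = None, float("-inf")
for values in product(*search_space.values()):
    params = dict(zip(search_space.keys(), values))
    score = evaluate(params)
    if score > best_score:
        best_params, best_score = params, score

print(best_params)
```

In practice the evaluation step would train and validate the full network for each combination, so randomized or adaptive search is often used when the grid grows large.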
Background: Deep Learning is a branch of Machine Learning that involves complex learning from massive data sets. It is implemented with Artificial Neural Networks (ANNs), including Convolutional Neural Networks. ANNs, in turn, are inspired by the biology of the human brain: they mimic the functioning of biological neurons through a combination of hardware (physical circuitry and processors) and software (algorithms implemented with ANN-supporting frameworks) to perform human-like tasks. Different Deep Learning models are used for different purposes.
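The way an ANN mimics a biological neuron can be illustrated with a single artificial neuron: a weighted sum of inputs plus a bias, passed through a non-linear activation. The weights and inputs below are illustrative values, not taken from the paper:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, analogous to a neuron's
    # integration of incoming signals.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # Sigmoid activation squashes the result into (0, 1),
    # analogous to the neuron's firing response.
    return 1.0 / (1.0 + math.exp(-z))

# Example with two inputs and hand-picked (illustrative) weights.
output = neuron([1.0, 0.5], weights=[0.4, -0.2], bias=0.1)
print(round(output, 4))
```

A full ANN stacks many such neurons into layers, and training adjusts the weights and biases to minimize prediction error.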
Design/Methodology/Approach: The model used for image recognition tasks is the Convolutional Neural Network (CNN). CNNs are likewise inspired by the human visual system's ability to recognize images, applied here to medical computer vision.
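The core operation a CNN's convolutional layer performs can be sketched as sliding a small kernel over a 2-D image and summing elementwise products. The image and kernel values below are illustrative; a real layer applies many kernels, adds a bias, and follows with a non-linearity:

```python
def conv2d(image, kernel):
    # Valid (no-padding) 2-D convolution of a single-channel image.
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = [[0.0] * out_w for _ in range(out_h)]
    for i in range(out_h):
        for j in range(out_w):
            # Elementwise product of the kernel with the image patch
            # under it, summed into one output value.
            out[i][j] = sum(
                image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh)
                for dj in range(kw)
            )
    return out

# A vertical-edge-detecting kernel applied to a tiny "image":
# the response is strongest where the dark/bright boundary lies.
image = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
]
kernel = [[1, -1], [1, -1]]
print(conv2d(image, kernel))
```

Training a CNN means learning the kernel values themselves, so the network discovers which visual features (edges, textures, shapes) are useful for the recognition task.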
Results/Findings: The research helps validate the concept that Machine Learning can successfully emulate the human brain's capacity for visual cognition. The ANN and Deep Learning models used in this research were implemented in Python with the popular TensorFlow and Keras frameworks. The results across 170 experiments are summarized.
Conclusion and Implications: The research produced informatics standards and infrastructure to overcome socio-technical barriers to neuroimaging data sharing and threats to reproducibility. It proposed the Neuroimaging Data Model (for machine-learning-based neuroimaging interpretation), a fundamentally new, granular data-exchange standard that provides a language for communicating provenance by representing primary data, computational workflows, and derived data.