
Artificial Neural Networks in Image Processing for Early Detection of Breast Cancer.

1. Introduction

Breast cancer is one of the main causes of death among women and the most frequently diagnosed non-skin cancer in women [1]. Breast cancer occurs when the cell tissues of the breast become abnormal and divide uncontrollably. These abnormal cells form a large lump of tissue, which consequently becomes a tumor [2]. Such disorders can be treated successfully if they are detected early. Thus, it is important to have appropriate methods for screening for the earliest signs of breast cancer.

Microcalcifications and masses are the earliest signs of breast cancer and can only be detected using modern techniques. Microcalcifications are clusters of calcium deposits that are very small in size and lie inside the soft breast tissue [2]. Generally, detection of masses in breast tissue is more challenging than detection of microcalcifications, not only because of the large variation in size and shape but also because masses often exhibit poor image contrast in mammography [3]. The difficulty of classifying microcalcifications as benign or malignant also poses a significant problem in medical image processing.

Automated classifiers may be useful to radiologists in distinguishing between benign and malignant patterns. Thus, in this paper, an artificial neural network (ANN) that can serve as an automated classifier is investigated. In medical image processing, ANNs have been applied to a variety of data-classification and pattern-recognition tasks and have become a promising classification tool in breast cancer [4]. ANN applications in mammography, ultrasound, MRI, and IR imaging for early detection of breast cancer are reviewed in this paper.

Image features can be distinguished in many respects, such as texture, color, shape, and spatial relations, and they can reflect subtle variations to different degrees. Thus, different selections of image features will result in different classification decisions. Classification methods can be divided into three types: first, methods based on statistics, such as the Support Vector Machine; second, methods based on rules, such as decision trees and rough sets; and third, artificial neural networks [5].

In the early 1980s, there was an increase in the use of neural networks in the field of image and signal processing. The main benefit was the reduction in processing time due to the parallel-distributed nature of neural networks [6]. Neural networks were subsequently used widely in common image processing methods such as vector quantization, eigenvector extraction, 2D pulse code modulation, and 2D filtering [7]. An artificial neural network resembles the function of biological neurons: it is composed of neurons arranged in layers, and these neurons are interconnected by numeric weights, which are adjusted during learning so that the network approaches the optimum result. In image processing applications, the number of neurons is usually directly related to the number of pixels in the input image [8], and the number of layers depends on the processing steps.
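The layered, weighted structure described above can be sketched as a tiny feed-forward network trained with backpropagation. The data, layer sizes, and learning rate below are illustrative assumptions, not values from any of the reviewed studies.

```python
import numpy as np

# Minimal sketch of a three-layer feed-forward network with weights
# adjusted by backpropagation. All sizes and data are toy assumptions.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "pixel" inputs (4 features) with binary labels.
X = rng.random((8, 4))
y = (X.sum(axis=1) > 2.0).astype(float).reshape(-1, 1)

# Numeric weights interconnecting the layers; learning changes them.
W1 = rng.normal(scale=0.5, size=(4, 6))
W2 = rng.normal(scale=0.5, size=(6, 1))

lr = 0.5
for _ in range(2000):
    h = sigmoid(X @ W1)           # hidden-layer activations
    out = sigmoid(h @ W2)         # network output
    # Backpropagate the output error through the weights.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    W1 -= lr * X.T @ d_h

preds = (sigmoid(sigmoid(X @ W1) @ W2) > 0.5).astype(float)
accuracy = float((preds == y).mean())
```

The weight matrices play the role of the "numeric weights" in the text: they are the only quantities the learning procedure changes.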

For cancer detection and classification, image segmentation has been widely used. Many image segmentation methods, based on histogram features, edge detection, region growing, or pixel classification, have been implemented using ANNs [9].

Although the technology related to ANNs in breast cancer detection has moved forward rapidly during the last few years, there is a dearth of critical literature review on the subject, which is a distinct drawback for further development of these technologies. This paper is an attempt to fill that gap in the field of image processing for the early detection of breast cancer.

2. Applications of ANNs

2.1. Mammogram. Mammography is one of the most effective methods used in hospitals and clinics for early detection of breast cancer. It has been proven to reduce mortality by as much as 30% [3]. The main objective of screening mammography is to detect a cancerous tumor early and remove it before the establishment of metastases [3, 10, 11]. The early signs of breast cancer are masses and microcalcifications, but abnormalities and normal breast tissues are often difficult to differentiate due to their subtle appearance and ambiguous margins [3]. Only about 3% of the required information is revealed in a mammogram, where part of the suspicious region is covered by vessels and normal tissue. This can make it difficult for radiologists to identify a cancerous tumor. Thus, computer-aided diagnosis (CAD) has been developed to overcome the limitations of mammography and assist radiologists in reading mammograms [10]. The ANN model is the most commonly used in CAD for mammography interpretation and biopsy decision making. ANNs assist mammography interpretation in two ways: first, by applying the classifier directly to the region-of-interest (ROI) image data, and second, by interpreting features extracted from the preprocessed image signals [12]. Figure 1 shows an example of an ANN structure with multifeatured input data and multiple hidden layers [12].

Microcalcifications are deposits of calcium in the soft breast tissue. They are quite minute in quantity and size and are found in clusters or in patterns of circles or lines, together with extra cell activity in the breast region [2]. Many researchers have developed CAD systems using artificial neural networks to detect microcalcifications. In the early 1990s, Dhawan et al. [13] defined image structure features using second-order grey-level statistics. The classification was based on a perceptron-based three-layer neural network trained with the backpropagation algorithm, which has been used successfully in a number of pattern recognition and classification applications [13, 14]. The entropy feature was found to have significant discriminating power for classification [13]. The same group extended their research by investigating the potential of second-order histogram textural features for their correlation with malignancy. Several neural network architectures were proposed to analyze the features extracted from segmented calcifications, and the results show that neural networks give good results for the classification of hard-to-diagnose cases of mammographic microcalcification into benign and malignant categories using the selected set of features [15].

Image segmentation is a technique used in image processing. Basically, segmentation is performed on the raw image to detect small, local, bright spots. Research by Kevin et al. [16] drew significant attention to the segmentation process and the neural network used. After segmentation, an ANN is applied to classify the segmented objects, called candidates, as either microcalcifications or nonmicrocalcifications. The accuracy of the ANN was tested on a set of labelled test images to determine the true positive (TP) and false positive (FP) detection rates. This ANN uses cascade correlation (CC) for pattern classification; it is a self-organizing ANN trained with a supervised learning algorithm. The CC ANN approach shows promising results in detecting microcalcifications [16].

Not only image segmentation but also image registration techniques can be used for breast cancer detection, with an ANN employed to enhance the effectiveness of the detection. In Saini and Vijay [17], Grey-Level Co-occurrence Matrix (GLCM) features are extracted and used as input to train an artificial neural network based breast cancer detection system. The extracted features of known and unknown mammogram images are then compared using feed-forward backpropagation and cascade-forward backpropagation ANNs to distinguish malignant from benign images. After optimizing the numbers of neurons and layers, the feed-forward backpropagation network achieved a higher accuracy of 87.5%, compared with 67.8% for the cascade-forward backpropagation network [17].
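As a rough illustration of the GLCM texture features used as classifier inputs in studies such as [17], the sketch below builds a normalized co-occurrence matrix and a few classic Haralick-style statistics from it. The tiny image and the single-pixel horizontal offset are assumptions for demonstration, not the exact configuration of the cited work.

```python
import numpy as np

def glcm(image, levels, dx=1, dy=0):
    """Normalized co-occurrence counts of grey levels at offset (dy, dx)."""
    m = np.zeros((levels, levels), dtype=float)
    h, w = image.shape
    for i in range(h - dy):
        for j in range(w - dx):
            m[image[i, j], image[i + dy, j + dx]] += 1
    return m / m.sum()

def glcm_features(p):
    """A few classic texture features computed from a normalized GLCM."""
    levels = p.shape[0]
    i, j = np.indices((levels, levels))
    contrast = float(((i - j) ** 2 * p).sum())
    energy = float((p ** 2).sum())
    homogeneity = float((p / (1.0 + np.abs(i - j))).sum())
    nz = p[p > 0]
    entropy = float(-(nz * np.log2(nz)).sum())
    return {"contrast": contrast, "energy": energy,
            "homogeneity": homogeneity, "entropy": entropy}

# Toy 4-level image standing in for a quantized mammogram patch.
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]], dtype=int)
feats = glcm_features(glcm(img, levels=4))
```

The resulting feature dictionary is the kind of fixed-length vector that can be fed to a feed-forward classifier.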

In the late 1990s, the application of ANNs in CAD mammography was found to be limited by data overfitting. Thus, a Bayesian belief network (BBN) was compared with an ANN classification method for identifying positive mass regions based on a set of computed features in CAD. The same database was used for an ANN and a BBN, with topologies optimized using a genetic algorithm (GA), to test the performance and robustness of both classifiers. The results showed no significant difference between using an ANN and using a BBN in CAD for mass detection if the network is optimized properly [18]. In Alayliogh and Aghdasi [11], a wavelet-based image enhancement technique was used to improve the detection of breast cancer. Input feature vectors containing spatial and spectral image information were fed to a neural network classifier. A microcalcification detection scheme and wavelet image enhancement were investigated: detection was performed by a multistage algorithm comprising image segmentation and pattern recognition to classify the microcalcifications, whereas biorthogonal spline wavelets were used in image enhancement to separate the image into frequency bands without affecting spatial locality. The results show that spatial and spectral features are a promising way to detect microcalcifications [11].

Besides microcalcifications, masses are the most important symptoms, and they are difficult to detect and distinguish accurately. A new algorithm based on two artificial neural networks was proposed to detect these masses automatically: an ANFIS and a multilayer perceptron (MLP) classifier were used for adjustment and filtration. Suitable methods and parameters must be applied to obtain high detection precision and a low false positive (FP) rate [10]. The detection process was well adjusted and improved with this proposed algorithm, and the final diagnosis results showed that the CAD scheme could simultaneously achieve comparatively high detection precision and a low false positive rate, even when special masses are dealt with [10].

In mammography equipped with a CAD system, the major problems are inconsistency and low classification accuracy. The accuracy can be improved by introducing an intelligent classifier that uses texture information as input for the classification of normal and abnormal tissues in mammograms. Dheeba et al. [3] used a neural network as a new artificial intelligence technique for tissue classification. A CAD system based on a wavelet neural network optimized with a Particle Swarm Optimization approach (PSOWNN) was designed and evaluated. Optimization with this heuristic algorithm finds appropriate numbers of hidden neurons, the momentum constant, and the learning rate during the training process, thereby improving the classification accuracy in breast cancer detection by reducing the misclassification rate [3]. In Zhang et al. [19], a backpropagation neural network (BPNN) is introduced for the classification of benign and malignant cases. The digitized mammogram is processed with a fuzzy detection algorithm to detect microcalcifications and suspicious areas. The BPNN gives a very promising classification result of 83.3% [19].
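The hyperparameter search used in PSOWNN can be sketched with a minimal particle swarm optimizer. Here the swarm merely minimizes a toy quadratic standing in for the misclassification rate, and all constants (swarm size, inertia, acceleration coefficients) are illustrative assumptions, not the values used in [3].

```python
import numpy as np

def pso(objective, dim, n_particles=20, iters=100, seed=0):
    """Minimal particle swarm optimization: particles track their own best
    position and are pulled toward the swarm-wide best."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5   # inertia and acceleration constants (toy)
    for _ in range(iters):
        r1 = rng.random(pos.shape)
        r2 = rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest

# Toy objective with its minimum at (1, 2), standing in for the
# misclassification rate as a function of (learning rate, momentum).
best = pso(lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2, dim=2)
```

In the actual PSOWNN setting, the objective evaluated at each particle would be the network's misclassification rate after training with that particle's hyperparameters.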

Automatic detection of any defect in a breast image obtained from a mammogram is highly advantageous. In Lashkari [20], Gabor wavelets and an ANN are used to classify normal and abnormal tissues, which can increase accuracy and save the radiologist's time (Figure 2). Gabor wavelet transforms have good attributes for image processing and computer vision. The results show that this combination has good potential, with 97% accuracy on unknown cases [20].

2.2. Ultrasound. Neural networks (NNs) also play a role in detecting breast cancer in ultrasound images. We first look at the capability of NNs in determining and recognizing regions where malignant and benign lesions can be found. Buller [21] was one of the first to use a neural network for breast cancer detection in ultrasound images. In his work, he separated the training process for benign and malignant cases by feeding one system only images containing benign lesions and the other only images containing malignant lesions. He also introduced a "spider web" topology that produces two vectors used further in the classification process: the first vector represents localized effects in a defined neighborhood, and the other represents global attributes. The technique is advantageous because the spider-web topology is sensitive to small areas and hence provides better results there by putting more weight on the localized effects. This technique can be improved by taking into account the extra parameters of texture and shape, since malignant and benign lesions have slightly different textures and shapes. In Ruggierol et al. [22], the researchers implemented an NN using both texture and shape parameters. A transition probability matrix and a run-length matrix were used on the texture parameters to quantify the homogeneity of the images, while the shape parameters capture the irregularity of the lesion border. A three-layer NN consisting of input, hidden, and output neurons was used, and training followed a leave-one-out procedure, with the left-out case used as the tester. The results show that the texture implementation achieved very good results on both solid and liquid lesions.

Classification is an important technique widely used to differentiate cancerous from noncancerous breasts. Denser breasts carry a higher risk of cancer. Knowing this, Sahiner et al. [23] describe the importance of texture images in the classification of dense breast tissue. They also introduce a convolution neural network classifier to replace backpropagation methods, with the images fed directly into the network. Grey-level difference statistics are used to measure the coarseness of the texture, whereas spatial grey-level dependence features show the element distribution. The strength of this method is that no image of a tumor is fed into the network, and segmentation does not need to be performed beforehand; instead, a threshold value is used. The drawback is the high computational cost, which makes the technique probably unsuitable for real-time operation.

As early as 1999, an NN classifier was used with autocorrelation features to classify breast lesions as benign or malignant. Chen et al. [24] introduced a multilayer feed-forward NN with 25 input nodes, comprising a 24-dimensional image feature vector from the ultrasound image together with a predefined threshold, to classify the lesion. The system has relatively high accuracy in classifying malignancies and thus could help inexperienced operators diagnose complicated ultrasound images. One striking advantage of the NN is that it can be further optimized by supplying a larger set of ultrasound images, as the NN is well trainable.
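One plausible construction of normalized autocorrelation features of the kind used in [24] is sketched below: lags over a 5 × 5 grid, minus the always-unity zero-lag term, yield a 24-dimensional vector. The synthetic patch and the exact lag grid and normalization are assumptions; the cited work's definitions may differ in detail.

```python
import numpy as np

def autocorr_features(patch, max_lag=4):
    """Normalized 2D autocorrelation values for all lags in a
    (max_lag+1) x (max_lag+1) grid, excluding the (0, 0) lag."""
    p = patch.astype(float) - patch.mean()
    denom = (p * p).sum()
    feats = []
    for dm in range(max_lag + 1):
        for dn in range(max_lag + 1):
            if dm == 0 and dn == 0:
                continue  # zero lag is identically 1 after normalization
            a = p[: p.shape[0] - dm, : p.shape[1] - dn]
            b = p[dm:, dn:]
            feats.append((a * b).sum() / denom)
    return np.array(feats)

# Synthetic stand-in for an ultrasound region of interest.
rng = np.random.default_rng(1)
patch = rng.integers(0, 256, size=(16, 16))
vec = autocorr_features(patch)   # 24 features: 5*5 lags minus (0, 0)
```

Each value lies in [-1, 1] by the Cauchy-Schwarz inequality, making the vector a convenient bounded input for an NN.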

The self-organizing map (SOM) is a type of neural network that can be trained without supervision; it has been widely used as a classifier in recent years. It expresses complex high-dimensional data in a simpler, lower-dimensional geometry, as in Chen et al. [25]. SOM training is fairly easy, as a desired output is not necessary: the SOM automatically chooses the closest input vector, so the classification of benign and malignant lesions is automated, and the only training input needed is the data from texture analysis. However, its accuracy is slightly lower than that of the multilayer feed-forward NN introduced earlier by Chen et al. [24]. With a high negative predictive value of 98%, this SOM-based CAD technique could potentially avoid benign biopsies. In the same year, Chen et al. [26] presented a hierarchical neural network (HNN) diagnostic system. Texture information features are extracted from four ultrasound images, the 2D normalized autocorrelation matrix for the input is modified, and a two-phase HNN is used to combine the texture information from all of the ultrasound images. This study shows that using all four images leads to a more promising result than using the images separately.

The bootstrap is a statistical method that relies on random sampling with replacement, and combining the bootstrap technique with a neural network helps improve accuracy. Chen et al. [27] implemented the bootstrap to perform random sampling, after which the observations were recomputed. This technique is useful when a large amount of training data is not available, as it does not require much training data. However, the reduced amount of data should be compensated for by adding an image analysis component to the bootstrap method.
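The resampling idea can be sketched in a few lines: draw samples with replacement from a small training set and examine the spread of a recomputed statistic. The synthetic feature values and the choice of the mean as the statistic are illustrative assumptions, not the setup of [27].

```python
import numpy as np

# Small "training set" of synthetic feature values.
rng = np.random.default_rng(42)
features = rng.normal(loc=5.0, scale=2.0, size=30)

# Bootstrap: resample with replacement and recompute the statistic.
n_boot = 1000
boot_means = np.array([
    rng.choice(features, size=features.size, replace=True).mean()
    for _ in range(n_boot)
])

# Percentile confidence interval for the mean from the replicates.
lo, hi = np.percentile(boot_means, [2.5, 97.5])
```

The same resampling loop can wrap a classifier instead of a mean, which is how the bootstrap compensates for scarce training data.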

Research done in 2002 used the error backpropagation algorithm to train a multilayer perceptron neural network (MLPNN) and achieved an area under the receiver operating characteristic (ROC) curve of 0.9396 ± 0.0183 [28]. Seven morphological features were introduced to differentiate benign from malignant breast lesions with the MLPNN [29]: the lobulation index (LI), elliptic-normalized skeleton (ENS), elliptic-normalized circumference (ENC), depth-to-width ratio (D:W), long-axis-to-short-axis ratio (L:S), number of substantial protuberances and depressions (NSPD), and the size of the lesion. The MLPNN was also tested with different numbers of hidden neurons, but all configurations led to similar performance. The accuracy on the training and test sets was better than that of the SOM and of the MLP NN with three inputs, but on par with the accuracy of the NN with 25 autocorrelation features as inputs. The inputs selected and their number may affect the accuracy of the NN itself, regardless of the type of NN technique.

A year later, a study tested an NN using only five morphological features: spiculation, branch pattern, ellipsoid shape, brightness of nodule, and number of lobulations [30, 31]. Based on these morphological features, the characteristics of benign and malignant lesions differ as follows:

(i) Spiculation (benign: larger; malignant: smaller)

(ii) Branch pattern (benign: fewer; malignant: more)

(iii) Ellipsoid shape (benign: smaller; malignant: larger)

(iv) Brightness of nodule (benign: larger; malignant: smaller)

(v) Number of lobulations (benign: fewer; malignant: more)
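Two of the morphological features above can be illustrated from a binary lesion mask: a depth-to-width ratio and a simple ellipsoid-shape score. The mask and both scoring formulas are assumptions for demonstration, not the exact definitions used in [29-31].

```python
import numpy as np

def morph_features(mask):
    """Toy morphological features from a binary lesion mask."""
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    depth = rows.sum()            # bounding-box height
    width = cols.sum()            # bounding-box width
    area = mask.sum()
    # Area of the ellipse inscribed in the bounding box: pi/4 * d * w.
    ellipse_area = np.pi / 4.0 * depth * width
    return {"depth_to_width": depth / width,
            "ellipsoid_shape": area / ellipse_area}

# A roughly elliptical, "benign-like" synthetic lesion mask.
yy, xx = np.mgrid[0:20, 0:30]
mask = (((yy - 10) / 6.0) ** 2 + ((xx - 15) / 12.0) ** 2) <= 1.0

feats = morph_features(mask)
```

A near-elliptical mask scores close to 1 on the shape term, while a spiculated or lobulated mask would score lower, which is the intuition behind the benign/malignant contrasts listed above.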

The latest research implements hybrid methods to improve on conventional neural network methods for detecting breast cancer malignancies. The combination of the k-means clustering algorithm with a backpropagation neural network (BPNN) has been proven to provide impressive performance [32]. Figure 3 shows the result of image segmentation using an ANN to extract cysts from an ultrasound breast image [32].
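The k-means step of such a hybrid pipeline can be sketched as follows: cluster pixel intensities into k groups so that a dark cyst-like region separates from brighter tissue. The synthetic "ultrasound" image, the intensity-only clustering, and k = 2 are illustrative assumptions, not the exact configuration of [32].

```python
import numpy as np

def kmeans_1d(values, k=2, iters=20, seed=0):
    """Plain Lloyd's algorithm on scalar values (pixel intensities)."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(values, size=k, replace=False).astype(float)
    for _ in range(iters):
        labels = np.argmin(np.abs(values[:, None] - centers[None, :]), axis=1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = values[labels == c].mean()
    return labels, centers

# Synthetic image: bright tissue with a dark 8x8 cyst-like region.
rng = np.random.default_rng(3)
img = rng.normal(150, 10, size=(32, 32))
img[8:16, 8:16] = rng.normal(40, 10, size=(8, 8))

labels, centers = kmeans_1d(img.ravel(), k=2)
cyst_label = int(np.argmin(centers))          # darker cluster
segmented = labels.reshape(img.shape) == cyst_label
```

In the hybrid scheme, the segmented region (or features computed from it) would then be passed to the BPNN for classification.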

2.3. Thermal Imaging. Thermal imaging has been used for early identification of breast tumors and risk prediction since the 1960s [33]. Thermography is a promising cutting-edge screening instrument, as it can warn women of breast malignancy up to 10 years in advance [34]. Some studies have utilized several types of ANNs to manipulate and classify IR images, taking the IR image as input to the ANN [35]. In 2003, multispectral IR images were classified using a Lagrange Constraint Neural Network (LCNN), which provides a better diagnosis for the physician [36]. Wavelet transformation is also useful with ANNs for multidimensional features of the IR image, especially since the temperature of the breast was found to be affected by many pathological factors, including mental state [37]. Asymmetry discrimination between the left and right breasts can be used to produce statistical features such as the mean temperature and standard deviation, which can serve as input parameters to a backpropagation ANN [33]. In 2007, thermographic image analysis was carried out with a special neural network that employs fuzzy logic principles, called the Complementary Learning Fuzzy Neural Network (CLFNN). The CLFNN takes many factors into account, such as family history and the temperature difference of the statistical features between contralateral breasts [34]. The system is widely used in several countries at present.

2.4. MRI. MRI has been used widely in medical examinations, especially for cancer investigation, for a few decades [38]. For the diagnosis to be done properly, the breast region should be extracted from surrounding regions and tissues using image segmentation methods [39]. Figure 4 depicts such a case as reported in [39].

Many neural network models have been utilized to aid MRI in enhancing the detection and classification of breast tumors; they can be trained with previous cases correctly diagnosed by clinicians [40], or they can manipulate the signal intensity or the mass characteristics (margins, shape, size, and granularity) [41]. In 2012, multistate cellular neural networks (CNNs) were used in MR image segmentation to estimate the density of the breast regions for evaluation of the fat content [39]. Hassanien et al. [38] introduced a hybrid model consisting of a Pulse Coupled Neural Network (PCNN) and Support Vector Machines (SVMs) to identify breast cancer from MR images. Another hybrid algorithm was presented by ElNawasany et al. in 2014, combining a perceptron with the Scale Invariant Feature Transform (SIFT) for the same purpose [42].

3. Discussion

For the last few decades, several computer-aided diagnosis (CAD) techniques have been developed for the mammographic examination of breast cancer to assist radiologists in overall image analysis and to highlight suspicious areas that need further attention. CAD can help a radiologist find a tumor that cannot be spotted with the naked eye. As the technology keeps growing, many researchers are concerned with developing intelligent techniques that can improve classification accuracy in mammography. This artificial intelligence makes use of human skills in a more efficient manner than conventional mathematical models do. Based on the research outcomes, the ANN has proved to be a good classifier in mammography for the classification of masses and microcalcifications. The implementation of a perceptron-based three-layer neural network with the backpropagation algorithm was a pioneering effort in ANN mammography. The various ANNs developed are based on the concept of increasing the true positive (TP) detection rate and decreasing the false positive (FP) and false negative (FN) detection rates for an optimum result. The implementation of wavelets in ANNs, such as the Particle Swarm Optimized Wavelet Neural Network (PSOWNN), biorthogonal spline wavelet ANN, second-order grey-level ANN, and Gabor wavelet ANN, can improve the sensitivity and specificity achieved in mass and microcalcification detection.

For ultrasound applications in determining breast cancer malignancy, CAD frameworks utilizing ultrasound images are widely used due to their nonradiation properties, low cost, high availability, speedier results, and higher accuracy. An improved version of breast cancer detection using ultrasound images has been introduced, working on three-dimensional ultrasound imaging, which gives more in-depth information on the breast lesion than conventional two-dimensional imaging; the three-dimensional imaging joins the two-dimensional characteristics. Furthermore, to handle the uncertain nature of ultrasound images, several methods and methodologies based on ANNs have been introduced, and a majority of the research works that utilize ANNs have acquired noteworthy results. Hybrid methods, which combine two ANN techniques, have recently been developed for the detection and classification of breast cancer, and a two-phase hierarchical NN has been found to be more promising than using the image analyses separately. It can also be seen that the larger the number of inputs to the ANN, the better the accuracy of the output in identifying and classifying breast cancer, whereas the number of hidden neurons does not seem to have a large impact on the accuracy of the system. Stating which individual ANN is the best is quite subjective, depending on the application and the various variables to be considered. Most ANN techniques for ultrasound give good results in terms of accuracy, sensitivity, specificity, positive predictive value, and negative predictive value. Another advantage of using an ANN for determining breast lesions is that the ANN can be trained further to produce better accuracy. Moreover, an ANN can be combined not only with another ANN technique but also with other signal processing techniques, such as wavelets, to produce better results.
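The evaluation metrics recurring throughout this discussion are all simple functions of the confusion-matrix counts; the sketch below computes them, with made-up counts used purely as an example.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Standard diagnostic metrics from confusion-matrix counts."""
    return {
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Made-up example counts: 45 true positives, 5 false positives,
# 40 true negatives, 10 false negatives.
m = diagnostic_metrics(tp=45, fp=5, tn=40, fn=10)
```

A high NPV, as reported for the SOM-based CAD above, means that a negative call is rarely wrong, which is why such a system could help avoid benign biopsies.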

Different imaging techniques for breast cancer call for different methods of detection and classification according to their input parameters. For IR imaging, it has been shown that the detection of breast cancer depends mainly on the statistical features of the thermal behavior of the tissues (mean, standard deviation, etc.), as well as on the asymmetry differentiation of the contralateral breasts. Image classification methods based on ANNs are therefore quite fruitful in thermography.

MRI is highly recognized as a reliable technique for tumor localization as well as early detection and classification of cancer, as it is generally recommended for soft tissue recognition. Many image segmentation and 3D extraction algorithms are applied in MRI applications, and recently many ANN classifier types have been designed with fine specifications for MRI breast imaging.

A summary of methods with NN in breast cancer detection has been given in tabulated form in Table 1.

4. Conclusion

Neural networks play an important role in the detection of carcinogenic conditions in the breast, acting as a stepping stone in the detection of cancer. In this review, we show that NNs can be used in many medical applications, which we categorized into the four main imaging modalities widely used in breast cancer detection: mammography, ultrasound, thermal imaging, and MRI. This shows that the NN is not restricted to a single application.

In all applications, the NN's main purposes were automated classification and segmentation. The types of data that need to be classified include calcification versus noncalcification, benign versus malignant, dense versus normal breast, and tumorous versus nontumorous. Neural networks need training data, and different types of data are fed into the NN for training purposes. In early adoptions of NNs, breast images were fed directly into the network. This method performs well only if very large databases are available, and with such huge data the concerns were storage, running time, and data availability. This flaw was addressed by taking the ROI into account, which lowered the dataset requirement tremendously. Researchers then went further and trained the NN with feature vectors. In our findings, the features that can be used as training data include spiculation, branch pattern, shape, brightness of nodule, number of lobulations, margin, size of nodule, granularity, and texture. These features can be extracted manually or using image analysis techniques. Introducing features improved the performance of NNs in terms of both the size of the training data and accuracy.

Different variations of NNs can be applied as classifiers. The feed-forward backpropagation NN is by far the simplest form of NN: as the name suggests, the input nodes have no interconnections with each other, and, more importantly, the units do not form repetitive cycles or loops. A feed-forward network can only pass data from the current layer to the subsequent layer, so the data moves in one fixed direction from input to output. Cascade-forward NNs are similar to feed-forward NNs; the only difference is that they include connections not only from the input but also from every previous layer to the subsequent layers. The convolution NN is considered a special type of feed-forward neural network in which multiple layers of small neuron collections process portions of the input image.
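The contrast between the two topologies can be made concrete in a few lines: a plain feed-forward pass versus a cascade-forward pass in which the output layer also receives the raw input. The random weights and layer sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(4)   # toy input vector

# Feed-forward: data moves strictly layer to layer, in one direction.
W1 = rng.normal(size=(4, 6))
W2 = rng.normal(size=(6, 2))
h = np.tanh(x @ W1)
ff_out = np.tanh(h @ W2)

# Cascade-forward: the output layer sees both the hidden activations
# AND the original input, so its weight matrix takes the concatenation.
C1 = rng.normal(size=(4, 6))
C2 = rng.normal(size=(6 + 4, 2))   # extra rows for the skip connection
h_c = np.tanh(x @ C1)
cf_out = np.tanh(np.concatenate([h_c, x]) @ C2)
```

The only structural difference is the extra rows of the output weight matrix, which carry the connections from every previous layer (here, the input) to the subsequent layer.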

The trend now is toward hybrid NNs such as the SOM model, and combinations of statistical methods such as the bootstrap are used together with NNs too. SOM and bootstrap methods require less training data and hence are useful when little training data is available. Researchers also combine SVMs with NNs to achieve better performance. In conclusion, NNs are widely used in medical image applications and are creatively combined with other methods to achieve better accuracy, sensitivity, and positive predictive value.

https://doi.org/10.1155/2017/2610628

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

This study has been supported by the Departments of Computer and Communication Engineering, Electrical and Electronics Engineering and Chemical and Environmental Engineering at Universiti Putra Malaysia (UPM).

References

[1] L. Hadjiiski, B. Sahiner, M. A. Helvie et al., "Breast masses: computer-aided diagnosis with serial mammograms," Radiology, vol. 240, no. 2, pp. 343-356, 2006.

[2] T. Balakumaran, I. L. A. Vennila, and C. G. Shankar, "Detection of microcalcification in mammograms using wavelet transform and fuzzy shell clustering," International Journal of Computer Science and Information Technology, vol. 7, no. 1, pp. 121-125, 2010.

[3] J. Dheeba, N. Albert Singh, and S. Tamil Selvi, "Computer-aided detection of breast cancer on mammograms: a swarm intelligence optimized wavelet neural network approach," Journal of Biomedical Informatics, vol. 49, pp. 45-52, 2014.

[4] S.-C. B. Lo, H.-P. Chan, J.-S. Lin, H. Li, M. T. Freedman, and S. K. Mun, "Artificial convolution neural network for medical image pattern recognition," Neural Networks, vol. 8, no. 7-8, pp. 1201-1214, 1995.

[5] C. Da, H. Zhang, and Y. Sang, "Brain CT image classification with deep neural networks," in Proceedings of the 18th Asia Pacific Symposium on Intelligent and Evolutionary Systems, vol. 1, pp. 653-662, 2015.

[6] W. S. Gan, "Application of neural networks to the processing of medical images," in Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN '91), vol. 1, pp. 300-306, IEEE, November 1991.

[7] E. S. Dunstone, "Image processing using an image approximation neural network," in Proceedings of the 1st International Conference on Image Processing, vol. 3, pp. 912-916, IEEE, Austin, Tex, USA, November 1994.

[8] H. S. Ranganath, G. Kuntimad, and J. L. Johnson, "Pulse coupled neural networks for image processing," in Proceedings of the IEEE Southeastcon '95. "Visualize the Future", pp. 37-43, 1995.

[9] H. Tang, K. C. Tan, and Z. Yi, "Competitive neural networks for image segmentation," Studies in Computational Intelligence, vol. 53, pp. 129-144, 2007.

[10] W. Xu, H. Li, and P. Xu, "A new ANN-based detection algorithm of the masses in digital mammograms," in Proceedings of the IEEE International Conference on Integration Technology (ICIT '07), pp. 26-30, IEEE, Shenzhen, China, March 2007.

[11] B. A. Alayliogh and F. Aghdasi, "An artificial neural network for detecting microcalcifications in wavelet-enhanced digitised mammograms," in Proceedings of the South African Symposium on Communications and Signal Processing (COMSIG '98), pp. 127-132, September 1998.

[12] T. Ayer, Q. Chen, and E. S. Burnside, "Artificial neural networks in mammography interpretation and diagnostic decision making," Computational and Mathematical Methods in Medicine, vol. 2013, Article ID 832509, 10 pages, 2013.

[13] A. P. Dhawan, Y. S. Chitre, M. Moskowitz, and G. Eric, "Classification of mammographic microcalcification and structural features using an artificial neural network," in Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 13, pp. 1105-1106, 1991.

[14] B. Widrow, R. G. Winter, and R. A. Baxter, "Layered neural nets for pattern recognition," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 36, no. 7, pp. 1109-1118, 1988.

[15] Y. Chitre, A. P. Dhawan, and M. Moskowitz, "Artificial neural network based classification of mammographic microcalcifications using image structure and cluster features," in Digital Mammography, A. G. Gale, S. Astley, D. Dance, and A. Y. Cairns, Eds., pp. 31-40, Elsevier, 1994.

[16] S. W. Kevin, C. C. Doss, L. P. Clarke, R. A. Clark, and M. Lee, "A neural network approach to microcalcification detection," in Proceedings of the Conference Record of the IEEE Nuclear Science Symposium and Medical Imaging Conference, pp. 1273-1275, Orlando, Fla, USA, October 1992.

[17] S. Saini and R. Vijay, "Back propagation artificial neural network," in Proceedings of the 5th International Conference on Communication Systems and Network Technologies, pp. 1177-1180, Gwalior, India, April 2015.

[18] B. Zheng, Y.-H. Chang, X.-H. Wang, and W. F. Good, "Comparison of artificial neural network and Bayesian belief network in a computer-assisted diagnosis scheme for mammography," in Proceedings of the International Joint Conference on Neural Networks (IJCNN '99), vol. 6, pp. 4181-4185, July 1999.

[19] G. Zhang, P. Yan, H. Zhao, and X. Zhang, "A computer aided diagnosis system in mammography using artificial neural networks," in Proceedings of the 1st International Conference on BioMedical Engineering and Informatics (BMEI '08), pp. 823-826, IEEE, Sanya, China, May 2008.

[20] A. Lashkari, "Full automatic micro calcification detection in mammogram images using artificial neural network and Gabor wavelets," in Proceedings of the 6th Iranian Conference on Machine Vision and Image Processing (MVIP '10), Isfahan, Iran, October 2010.

[21] D. Buller, A. Buller, P. R. Innocent, and W. Pawlak, "Determining and classifying the region of interest in ultrasonic images of the breast using neural networks," Artificial Intelligence in Medicine, vol. 8, no. 1, pp. 53-66, 1996.

[22] C. Ruggiero, F. Bagnoli, R. Sacile, M. Calabrese, G. Rescinito, and F. Sardanelli, "Automatic recognition of malignant lesions in ultrasound images by artificial neural networks," in Proceedings of the 20th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 2, pp. 872-875, Hong Kong, November 1998.

[23] B. Sahiner, H.-P. Chan, N. Petrick et al., "Classification of mass and normal breast tissue: a convolution neural network classifier with spatial domain and texture images," IEEE Transactions on Medical Imaging, vol. 15, no. 5, pp. 598-610, 1996.

[24] D.-R. Chen, R.-F. Chang, and Y.-L. Huang, "Computer-aided diagnosis applied to US of solid breast nodules by using neural networks," Radiology, vol. 213, no. 2, pp. 407-412, 1999.

[25] D.-R. Chen, R.-F. Chang, and Y.-L. Huang, "Breast cancer diagnosis using self-organizing map for sonography," Ultrasound in Medicine and Biology, vol. 26, no. 3, pp. 405-411, 2000.

[26] D.-R. Chen, R.-F. Chang, Y.-L. Huang, Y.-H. Chou, C.-M. Tiu, and P.-P. Tsai, "Texture analysis of breast tumors on sonograms," Seminars in Ultrasound, CT and MRI, vol. 21, no. 4, pp. 308-316, 2000.

[27] D.-R. Chen, W.-J. Kuo, R.-F. Chang, W. K. Moon, and C. C. Lee, "Use of the bootstrap technique with small training sets for computer-aided diagnosis in breast ultrasound," Ultrasound in Medicine and Biology, vol. 28, no. 7, pp. 897-902, 2002.

[28] D.-R. Chen, R.-F. Chang, W.-J. Kuo, M.-C. Chen, and Y.-L. Huang, "Diagnosis of breast tumors with sonographic texture analysis using wavelet transform and neural networks," Ultrasound in Medicine and Biology, vol. 28, no. 10, pp. 1301-1310, 2002.

[29] C.-M. Chen, Y.-H. Chou, K.-C. Han et al., "Breast lesions on sonograms: computer-aided diagnosis with nearly setting-independent features and artificial neural networks," Radiology, vol. 226, no. 2, pp. 504-514, 2003.

[30] S. Joo, Y. S. Yang, W. K. Moon, and H. C. Kim, "Computer-aided diagnosis of solid breast nodules: use of an artificial neural network based on multiple sonographic features," IEEE Transactions on Medical Imaging, vol. 23, no. 10, pp. 1292-1300, 2004.

[31] S. Joo, W. K. Moon, and H. C. Kim, "Computer-aided diagnosis of solid breast nodules on ultrasound with digital image processing and artificial neural network," in Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC '04), vol. 2, pp. 1397-1400, IEEE, San Francisco, Calif, USA, September 2004.

[32] K. Zheng, T.-F. Wang, J.-L. Lin, and D.-Y. Li, "Recognition of breast ultrasound images using a hybrid method," in Proceedings of the IEEE/ICME International Conference on Complex Medical Engineering (CME '07), pp. 640-643, IEEE, Beijing, China, May 2007.

[33] J. Koay, C. Herry, and M. Frize, "Analysis of breast thermography with an artificial neural network," in Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 2, pp. 1159-1162, IEEE, San Francisco, Calif, USA, September 2004.

[34] T. Z. Tan, C. Quek, G. S. Ng, and E. Y. K. Ng, "A novel cognitive interpretation of breast cancer thermography with complementary learning fuzzy neural memory structure," Expert Systems with Applications, vol. 33, no. 3, pp. 652-666, 2007.

[35] S. C. Fok, E. Y. K. Ng, and K. Tai, "Early detection and visualization of breast tumor with thermogram and neural network," Journal of Mechanics in Medicine and Biology, vol. 2, no. 2, pp. 185-195, 2002.

[36] H. Szu, I. Kopriva, P. Hoekstra et al., "Early tumor detection by multiple infrared unsupervised neural nets fusion," in Proceedings of the 25th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 2, pp. 1133-1136, IEEE, September 2003.

[37] T. Jakubowska, B. Wiecek, M. Wysocki, C. Drews-Peszynski, and M. Strzelecki, "Classification of breast thermal images using artificial neural networks," in Proceedings of the 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04), vol. 2, pp. 1155-1158, September 2004.

[38] A. E. Hassanien, N. El-Bendary, M. Kudelka, and V. Snasel, "Breast cancer detection and classification using support vector machines and pulse coupled neural network," in Proceedings of the Third International Conference on Intelligent Human Computer Interaction (IHCI 2011), Prague, Czech Republic, August, 2011, vol. 179 of Advances in Intelligent Systems and Computing, pp. 269-279, Springer, Berlin, Germany, 2013.

[39] G. Ertas, D. Demirgunes, and O. Erogul, "Conventional and multi-state cellular neural networks in segmenting breast region from MR images: performance comparison," in Proceedings of the International Symposium on Innovations in Intelligent Systems and Applications (INISTA '12), pp. 1-4, 2012.

[40] F. A. Cardillo, A. Starita, D. Caramella, and A. Cillotti, "A neural tool for breast cancer detection and classification in MRI," in Proceedings of the 23rd Annual International Conference of the IEEE Engineering in Medicine and Biology Society, vol. 3, pp. 2733-2736, October 2001.

[41] A. A. Tzacheva, K. Najarian, and J. P. Brockway, "Breast cancer detection in gadolinium-enhanced mr images by static region descriptors and neural networks," Journal of Magnetic Resonance Imaging, vol. 17, no. 3, pp. 337-342, 2003.

[42] A. M. ElNawasany, A. F. Ali, and M. E. Waheed, "A novel hybrid perceptron neural network algorithm for classifying breast MRI tumors," in Proceedings of the International Conference on Advanced Machine Learning Technologies and Applications, pp. 357-366, Cairo, Egypt, November 2014.

M. M. Mehdy, (1) P. Y. Ng, (1) E. F. Shair, (2) N. I. Md Saleh, (3) and C. Gomes (2)

(1) Department of Computer and Communication System Engineering, Universiti Putra Malaysia, Serdang, Selangor, Malaysia

(2) Department of Electrical and Electronics Engineering, Universiti Putra Malaysia, Serdang, Selangor, Malaysia

(3) Department of Chemical and Environmental Engineering, Universiti Putra Malaysia, Serdang, Selangor, Malaysia

Correspondence should be addressed to C. Gomes; chandima.gomes@gmail.com

Received 15 January 2017; Accepted 9 March 2017; Published 3 April 2017

Academic Editor: Po-Hsiang Tsui

Caption: FIGURE 1: Structure of a typical ANN for classification of breast tumors in mammography [12].

Caption: FIGURE 2: Results ((a)-(c)): original image, image after the first stage of NN processing, and image after the second stage of NN processing, using Gabor wavelets as input for a mammogram image [20].

Caption: FIGURE 3: Segmentation of cysts in a breast ultrasound image using an ANN [32].

Caption: FIGURE 4: Multistate CNN used to segment small fatty breast and medium dense breast for MRI image [39].
TABLE 1: Summary of methods with NN in breast cancer detection.

Study                   Methods                   Input

Dheeba et               Particle Swarm          Mammogram
                        Optimized Wavelet
al. [3]                 Neural Network
                        (PSOWNN)

Xu et al. [10]          New algorithm based     Mammogram
                        on two ANNs

Alayliogh and           ANN and biorthogonal    Mammogram
Aghdasi [11]            spline wavelet

Dhawan et               (i) ANN                 Mammogram
al. [13]                (ii) second-order
                        gray-level statistics

Chitre et al.           ANN                     Mammogram
[15]

Kevin et al.            ANN                     Mammogram
[16]

Zheng et al. [18]       ANN and BBN             Mammogram

Zhang et al. [19]       Digitize module,
                        detection module,
                        feature extraction      Mammogram
                        module, neural
                        network module,
                        and classification
                        module

Lashkari [20]           ANN and Gabor           Mammogram
                        wavelets

Saini and               Image registration      Mammogram
Vijay [17]              technique and ANN

Buller et al. [21]      Spider web              Ultrasound
                        topology with NN

Ruggiero et al. [22]    Texture and shape       Ultrasound
                        parameter feeds
                        into NN

Sahiner et al. [23]     Convolutional NN        Mammogram
                        with spatial and
                        texture image

Chen et al.             Multilayer              Ultrasound
[24]                    feed-forward
                        neural network
                        (MFNN)

Chen et al.             Self-organizing         Ultrasound
[25]                    map (SOM)

Chen et al. [27]        Bootstrap with NN       Ultrasound

Chen et al.             2-phase                 Ultrasound
[26]                    Hierarchical
                        Neural Network
                        (HNN)

Chen et                 Wavelet transform       Ultrasound
al. [28]                and neural
                        network

Chen et al.             Multilayer              Ultrasound
[29]                    feed-forward
                        neural network
                        (MFNN)

Joo et al. [30]         Artificial neural       Ultrasound
                        network (ANN)

Joo et al. [31]         Digital image           Ultrasound
                        processing and
                        artificial neural
                        network

Zheng et al. [32]       Hybrid method           Ultrasound
                        (unsupervised
                        k-means cluster,
                        supervised
                        backpropagation
                        neural network
                        (BPNN))

Fok et al. [35]         ANN with 3D                 IR
                        finite element
                        analysis

Szu et al. [36]         Unsupervised
                        classification           Mid and
                        using Lagrange           long IR
                        Constraint Neural         images
                        Network (LCNN)

Jakubowska              ANN with wavelet            IR
et al. [37]             transform

Koay et al. [33]        Backpropagation             IR
                        NN

Tan et al. [34]         Fuzzy adaptive              IR
                        learning control
                        network fuzzy neural
                        network

Cardillo et al.         NN for automatic           MRI
[40]                    analysis of image
                        statistics

Tzacheva et             Evaluation of signal       MRI
al. [41]                intensity and mass
                        properties by NN

Ertas et al.            Extraction of breast       MRI
[39]                    regions by
                        conventional and
                        multistate CNNs

Hassanien               Image classification       MRI
et al. [38]             using PCNN and SVM
                        and using wavelet
                        and fuzzy sets for
                        enhancement

ElNawasany et           Classifying MR             MRI
al. [42]                images by hybrid
                        perceptron NN

Study                            Purpose

Dheeba et                        Improve
                             classification
al. [3]                    accuracy in breast
                          cancer detection and
                                reducing
                            misclassification
                                  rate

Xu et al. [10]              Classification of
                                 masses

Alayliogh and               Classification of
Aghdasi [11]               microcalcification
                            cluster (MCC) and
                            image enhancement

Dhawan et                   Classification of
al. [13]                 significant and benign
                           microcalcifications

Chitre et al.               Classification of
[15]                     microcalcification into
                          benign and malignant

Kevin et al.                Classification of
[16]                       microcalcifications
                                   and
                         nonmicrocalcifications

Zheng et al. [18]         Compare performances
                             of ANN and BBN

Zhang et al. [19]           Classification of
                           microcalcification
                        clusters/suspicious areas

Lashkari [20]           Classification of breast
                          tissues to normal and
                            abnormal classes
                              automatically

Saini and                   Classification of
Vijay [17]                benign and malignant

Buller et al. [21]        Classify and separate
                          benign and malignant
                                 lesion

Ruggiero et al. [22]            Automated
                             recognition of
                            malignant lesion

Sahiner et al. [23]         Classification of
                             mass and normal
                                 breast

Chen et al.                Classify benign and
[24]                        malignant lesion

Chen et al.                 Classification of
[25]                      benign and malignant
                                 lesions

Chen et al. [27]             Classification
                                of tumor

Chen et al.                   Differentiate
[26]                       between benign and
                            malignant tumors

Chen et                       Differential
al. [28]                   diagnosis of breast
                           tumors on sonograms

Chen et al.               Differentiate benign
[29]                         from malignant
                             breast lesions

Joo et al. [30]            Determining whether
                           a breast nodule is
                           benign or malignant

Joo et al. [31]             Determine breast
                            nodule malignancy

Zheng et al. [32]           Classification of
                            breast tumors as
                           benign or malignant

Fok et al. [35]             Tumor prediction

Szu et al. [36]
                             Early detection
                                of breast
                                 cancer

Jakubowska                  Discrimination of
et al. [37]                    healthy and
                           pathological cases

Koay et al. [33]             Early detection
                            of breast cancer

Tan et al. [34]            Early detection of
                            breast cancer and
                          tumor classification

Cardillo et al.            Early detection and
[40]                         classification

Tzacheva et                Automatic diagnosis
al. [41]                        of tumors

Ertas et al.                 Breast density
[39]                         evaluation and
                               abnormality
                              localization

Hassanien                     Breast cancer
et al. [38]                     detection

ElNawasany et              Early detection of
al. [42]                      breast cancer

Study                   Dataset

Dheeba et               216 mammograms

al. [3]

Xu et al. [10]          30 cases and 60
                        mammograms
                        (containing 78
                        masses)

Alayliogh and           40 digitized mammogram
Aghdasi [11]

Dhawan et               5 image structure
al. [13]                features

Chitre et al.           (i) 40, 60, and
[15]                    80 training cases
                        (ii) 151, 131, and
                        111 test cases

Kevin et al.            24 mammograms
[16]                    with each containing
                        at least one
                        cluster of
                        microcalcifications

Zheng et al. [18]       3 independent
                        image databases
                        and 38 features

Zhang et al. [19]       Fuzzy detection
                        algorithm (i) 30
                        digital images
                        (15 contain benign
                        cases and 15 contain
                        malignant cases)

Lashkari [20]           (i) Images of 50
                          normal and 50
                          abnormal breast
                          tissues
                        (ii) 65 cases for
                          training set and 35
                          cases for testing
                          set

Saini and               42 mammogram images
Vijay [17]              (30 benign and 12
                        malignant images)

Buller et al. [21]      25 sonograms

Ruggiero et al. [22]    (i) 41 carcinomas
                        (ii) 41 fibroadenomas
                        (iii) 41 cysts

Sahiner et al. [23]     168 mammograms

Chen et al.             140 pathologically
[24]                    proven tumors (52
                        malignant, 88
                        benign)

Chen et al.             243 tumors (82
[25]                    malignant, 161
                        benign)

Chen et al. [27]        263 sonographic
                        image solid breast
                        nodules

Chen et al.             1020 images
[26]                    (4 different
                        rectangular regions
                        from the 2
                        orthogonal planes
                        of each tumor)

Chen et                 242 cases
al. [28]                (161 benign,
                        82 malignant)

Chen et al.             1st set: 160 lesions
[29]                    2nd set: 111 lesions

Joo et al. [30]         584 histologically
                        confirmed cases
                        (300 benign, 284
                        malignant)

Joo et al. [31]         584 histologically
                        confirmed cases (300
                        benign, 284
                        malignant)

Zheng et al. [32]       125 benign tumors,
                        110 malignant tumors

Fok et al. [35]         200 patients

Szu et al. [36]
                        One patient
                        with DCIS

Jakubowska              30 healthy
et al. [37]
                        10 with
                        recognized tumors

Koay et al. [33]        19 patients

Tan et al. [34]         28 healthy, 43
                        benign tumors, 7
                        cancer patients

Cardillo et al.         150 exams subdivided
[40]                    into 6 groups by
                        contrast

Tzacheva et             14 patients
al. [41]

Ertas et al.            23 women
[39]

Hassanien               70 normal cases,
et al. [38]             50 benign and
                        malignant cases

ElNawasany et           138 abnormal and
al. [42]                143 normal

Study                   Classifier

Dheeba et               PSOWNN

al. [3]

Xu et al. [10]          ANFIS and MLP

Alayliogh and           ANN
Aghdasi [11]

Dhawan et               (i) Three-layer
al. [13]                perceptron based ANN

Chitre et al.           ANN
[15]

Kevin et al.            Cascade
[16]                    correlation ANN
                        (CC ANN)

Zheng et al. [18]       ANN and BBN

Zhang et al. [19]       Backpropagation
                        neural network (BPNN)

Lashkari [20]           ANN and Gabor
                        wavelets

Saini and               Feed-forward
Vijay [17]              backpropagation and
                        Cascade forward
                        backpropagation
                        artificial neural
                        network

Buller et al. [21]      (i) NN classifier

Ruggiero et al. [22]    (i) NN classifier

Sahiner et al. [23]     (i) Convolution
                        NN classifier

Chen et al.             MFNN
[24]

Chen et al.             SOM
[25]

Chen et al. [27]        NN

Chen et al.             HNN
[26]

Chen et                 Multilayer
al. [28]                perceptron neural
                        network (MLPNN)

Chen et al.             MFNN
[29]

Joo et al. [30]         ANN

Joo et al. [31]
                        ANN

Zheng et al. [32]       Combination of
                        k-means with
                        BPNN

Fok et al. [35]         ANN

Szu et al. [36]
                        LCNN

Jakubowska              ANN
et al. [37]

Koay et al. [33]        Levenberg-Marquardt
                        (LM) and Resilient
                        Backpropagation (RP)

Tan et al. [34]         FALCON-AART

Cardillo et al.         NN
[40]

Tzacheva et             Feed-forward BPNN
al. [41]

Ertas et al.            CNN
[39]

Hassanien               Hybrid scheme
et al. [38]             of PCNN and SVM

EINawasany et           Perceptron
al. [42]                with SIFT

Study                   Results

Dheeba et               (i) Sensitivity
                          94.167%
al. [3]                 (ii) Specificity
                          92.105%
                        (iii) AUC 0.96853
                        (iv) Youdens index
                          0.86272
                        (v) Misclassification
                          rate 0.063291

Xu et al. [10]          (i) True positive
                          (TP) rate 93.6%
                          (73/78),
                        (ii) Number of the
                          FPs per image
                          0.63 (38/60).

Alayliogh and           (i) Sensitivity 93%,
Aghdasi [11]            (ii) FP rate
                          (MCC/image) 0.82

Dhawan et               The entropy feature
al. [13]                has significant
                        discriminating power
                        for classification

Chitre et al.           Neural network is a
[15]                    robust classifier of
                        a combination of
                        image structure and
                        binary features into
                        benign and malignant

Kevin et al.            (i) TP detection
[16]                    rate for individual

                        microcalcifications
                        is 73% and 92% for
                        nonmicrocalcifications

Zheng et al. [18]       Performance level
                          (A_z value)
                        (i) ANN A_z value
                          0.847 ± 0.014
                        (ii) BBN A_z value
                           0.845 ± 0.011
                        (iii) Hybrid classifier
                          (ANN and BBN)
                        A_z value increased
                          to 0.859 ± 0.01

Zhang et al. [19]       (i) Fuzzy detection rate
                          (benign 84.10% and 80.30%)
                        (ii) Classification rates
                          (feature vector,
                          n = 10 is 83.8%),
                          (feature vector,
                          n = 14 is 72.2%)

Lashkari [20]           (i) Classification
                        rate (testing
                        performance 96.3%
                        and training
                        performance 97.5%)

Saini and               Percentage accuracy
Vijay [17]              (feed-forward
                        backpropagation network is
                        87.5% and Cascade forward
                        backpropagation network is
                        67.8%)

Buller et al. [21]      (i) 69% accuracy
                          in malignant
                        (ii) 66% accuracy
                          in benign
                        (iii) 66% accuracy
                          in no lesions

Ruggiero et al. [22]    (i) 95% accuracy in
                          solid lesions
                        (ii) 92.7% accuracy in
                          liquid lesions

Sahiner et al. [23]     (i) Average true
                        positive fraction of
                        90% at false
                        positive fraction of
                        31%

Chen et al.             (i) 95% accuracy, 98%
[24]                      sensitivity
                        (ii) 93% specificity
                        (iii) 89% positive
                          predictive value
                        (iv) 99% negative
                          predictive
                          value

Chen et al.             (i) Accuracy of 85.6,
[25]                      sensitivity 97.6%
                        (ii) Specificity 79.5%
                        (iii) Positive
                          predictive value
                          70.8%
                        (iv) Negative
                          predictive value
                          98.5%

Chen et al. [27]        (i) Accuracy 87.07%,
                          sensitivity 98.35%
                        (ii) Specificity
                          79.10%
                        (iii) Positive
                          predictive value
                          81.46%
                        (iv) Negative
                          predictive value
                          94.64%

Chen et al.             4 image analyses of
[26]                    each tumor appear to
                        give more promising
                        result than if they
                        are used separately

Chen et                 (i) Receiver operating
al. [28]                  characteristic (ROC)
                          area index is
                          0.9396 ± 0.0183
                        (ii) 98.77% sensitivity,
                          81.37% specificity
                        (iii) 72.73% positive
                          predictive value
                        (iv) 99.24% negative
                          predictive value

Chen et al.             (i) 98.2% training
[29]                    accuracy
                        (ii) 95.5% testing
                        accuracy

Joo et al. [30]         (i) 100% training
                          accuracy
                        (ii) 91.4% testing set
                        (iii) 92.3% sensitivity,
                          90.7% specificity

Joo et al. [31]         (i) 91.4% accuracy,
                          92.3% sensitivity
                        (ii) 90.7% specificity

Zheng et al. [32]       (i) Recognition rate
                          (94.5% for benign,
                          93.6% for malignant)
                        (ii) 94% accuracy,
                          94.5% sensitivity
                        (iii) 93.6% specificity

Fok et al. [35]         Good detection,
                        poor sensitivity

Szu et al. [36]
                        Better sensitivity

Jakubowska              Accuracy (%)
et al. [37]               (frontal/side)
                        Raw: 90/93, PCA: 90/93
                        LDA: 93/97, NDA: 93/93
                        Accuracy (%)
                          (frontal/side)
                        Raw: 80/90, PCA: 80/90
                        LDA: 90/90, NDA: 80/100

Koay et al. [33]        Accuracy (%)
                        (RP/LM)
                        Whole:
                        95/95
                        Quadrants: 95/100

Tan et al. [34]         Cancer detection
                          (%) (TH/TDF)
                        Predicted: 95,
                          sensitivity: 100,
                          specificity: 60
                        Breast tumor
                          detection (%)
                          (TH/TDF)
                        Predicted: 84/71,
                          sensitivity:
                          33/76,
                          specificity: 91/62
                        Breast tumor
                          classification (%)
                          (TH/TDF)
                        Predicted: 88/84,
                          sensitivity: 33/33,
                          specificity: 95.5/91

Cardillo et al.         Better in specificity
[40]

Tzacheva et             90%-100% sensitivity,
al. [41]                91%-100% specificity,
                        and 91%-100%
                        accuracy

Ertas et al.            Average precision
[39]                      99.3 ± 1.8%
                        True positive volume
                          fraction
                          99.5 ± 1.3%
                        False positive
                          volume fraction
                          0.1 ± 0.2%

Hassanien               Accuracy
et al. [38]             SVM: 98%
                        Rough sets: 92%

ElNawasany et           Accuracy 86.74%
al. [42]
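Many of the classifiers summarized in Table 1 (MFNN, BPNN, the NN stages of hybrid schemes) are variants of a multilayer feed-forward network trained with backpropagation. As an illustrative sketch only, not the implementation of any study in the table, a minimal one-hidden-layer network for binary benign/malignant classification might look like the following; the feature vectors here are synthetic stand-ins for extracted image features.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, hidden=8, lr=0.5, epochs=2000):
    """Train a one-hidden-layer MLP with plain gradient-descent backpropagation."""
    n_features = X.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_features, hidden))
    b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1))
    b2 = np.zeros(1)
    for _ in range(epochs):
        # Forward pass: hidden activations, then output probability.
        h = sigmoid(X @ W1 + b1)
        p = sigmoid(h @ W2 + b2).ravel()
        # Backward pass: gradient of mean cross-entropy w.r.t. each layer.
        dp = (p - y)[:, None] / len(y)
        dW2 = h.T @ dp
        db2 = dp.sum(axis=0)
        dh = (dp @ W2.T) * h * (1.0 - h)
        dW1 = X.T @ dh
        db1 = dh.sum(axis=0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
    return W1, b1, W2, b2

def predict(params, X):
    W1, b1, W2, b2 = params
    return (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel() > 0.5).astype(int)

# Synthetic two-class "feature vectors" (two separable Gaussian clusters),
# standing in for texture/shape features extracted from breast images.
X = np.vstack([rng.normal(0.0, 1.0, (100, 5)), rng.normal(2.0, 1.0, (100, 5))])
y = np.array([0] * 100 + [1] * 100)

params = train_mlp(X, y)
acc = (predict(params, X) == y).mean()
print(f"training accuracy: {acc:.2f}")
```

On such cleanly separable synthetic data the network fits easily; the studies above differ mainly in which features feed the input layer and how the network is trained and validated.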
COPYRIGHT 2017 Hindawi Limited. No portion of this article can be reproduced without the express written permission from the copyright holder.

Publication: Computational and Mathematical Methods in Medicine, Jan 1, 2017.