Deep Learning in Different Ultrasound Methods for Breast Cancer

Breast cancer is the second leading cause of cancer death among women worldwide. Ultrasound (US) is a noninvasive imaging modality used to diagnose breast lesions and monitor the prognosis of cancer patients. It has the highest sensitivity for diagnosing breast masses, but its high operator dependency increases the false-negative rate. Underserved areas lack sufficient US expertise to diagnose breast lesions, resulting in delayed management. Deep learning neural networks may facilitate early decision-making by physicians by diagnosing breast lesions rapidly yet accurately and monitoring their prognosis.

deep learning; ultrasound modalities; breast cancer

1. Introduction

Breast cancer is the most common cancer worldwide and the second leading cause of cancer death among women [1]. Ultrasound (US) is used in conjunction with mammography to screen for and diagnose breast masses, particularly in dense breasts. US has the potential to reduce the overall cost of breast cancer management, and it can reduce the number of benign open biopsies by facilitating fine-needle aspiration, which is preferable because of its high sensitivity, high specificity, and limited invasiveness [2][3][4][5]. The BI-RADS classification helps distinguish patients who need follow-up imaging from patients who require diagnostic biopsy [6]. Moreover, intraoperative use of US can localize breast cancer cost-effectively and reduce the tumor-involved margin rate, ultimately lowering the cost of additional management [7][8]. However, one of the major disadvantages of ultrasonography is its high operator dependency, which increases the false-negative rate [9].
Deep learning may therefore come into play by reducing the manual workload of operators and creating a new role for physicians. Incorporating deep learning models into ultrasound may reduce the false-negative rate and the overall cost of breast cancer management. Such models can help physicians and patients make prompt decisions by detecting and diagnosing lesions and by monitoring prognosis and treatment progress with considerable accuracy and time efficiency. This possibility has created considerable enthusiasm, but it also needs critical evaluation.
Several review papers have been published in the last decade on the role of deep learning in ultrasound for breast cancer segmentation and classification. They mostly combined deep learning models with B-mode, shear wave elastography (SWE), and color Doppler images, and sometimes with other imaging modalities [10][11][12][13][14][15]. Several surveys have also covered deep learning and machine learning models using B-mode and SWE images, as well as multimodality images, for breast cancer classification [16][17][18]. Concerns remain, such as bias in favor of the new model and whether the findings are generalizable and applicable to real-world settings. A considerable number of deep learning models have been developed for automatic breast cancer segmentation and classification, but there is a lack of data on how they improve the overall management of breast cancer, from screening to diagnosis and ultimately to survival. There are also insufficient data on which ultrasound modes are being used for deep learning algorithms.
This entry reviews the current research trends in deep learning models across different ultrasound modalities for breast cancer management, from screening to diagnosis to prognosis, and discusses the future challenges and directions of applying these models.

2. Imaging Modalities Used for Breast Lesions

Various imaging modalities are used to diagnose breast masses. Self-examination, mammography, and ultrasound are usually used for screening; if a mass is found, ultrasonography and/or MRI is usually performed to evaluate the lesion [19]. Ultrasound has been used at various stages of breast cancer management, including screening of dense breasts, diagnosis, and monitoring of prognosis during chemotherapy, owing to its noninvasive nature, lack of ionizing radiation, portability, real-time guidance for biopsies, and cost-effectiveness.

3. Computer-Aided Diagnosis and Machine Learning in Breast Ultrasound

Computer-aided diagnosis (CAD) combines machine learning and deep learning models with multidisciplinary knowledge to diagnose a breast mass [20]. Handheld US has been supplemented with automated breast US (ABUS) to reduce intraoperator variability [21]. The impact of 3D ABUS as a screening modality has been investigated for breast cancer detection in dense breasts, as the CAD system substantially decreases interpretation time [21]. For diagnosis, several studies have shown that 3D ABUS can help detect breast lesions and distinguish malignant from benign lesions [22], predict the extent of breast lesions [23], monitor the response to neoadjuvant chemotherapy [24], and correlate findings with molecular subtypes of breast cancer [25], all with high interobserver agreement [21][26]. One study proposed a CAD system that used a super-resolution algorithm to reconstruct a high-resolution image from a set of low-resolution images, improving texture analysis methods for breast tumor classification [27].
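Texture descriptors of the kind used in such CAD systems are commonly derived from gray-level co-occurrence matrices (GLCMs). Below is a minimal sketch, not the pipeline of any cited study, that uses scikit-image to extract a few standard GLCM statistics from a grayscale patch; the random patch stands in for a real ultrasound region of interest.

```python
# Minimal GLCM texture-feature sketch (illustrative only).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch: np.ndarray) -> dict:
    """Compute a few common texture statistics from an 8-bit patch."""
    # Co-occurrence counts over four directions at a 1-pixel distance.
    glcm = graycomatrix(patch, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    # Average each statistic over the four directions.
    return {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

# Random patch standing in for a real ultrasound ROI.
patch = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
print(glcm_features(patch))
```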
In machine learning, features that may appear distinctive in the data are discerned and encoded by human experts, and the data are then organized or segregated according to these features using statistical techniques [28][29]. Research on various machine learning models for classifying benign and malignant breast masses has been published over the past decade [30]. Recent papers have used the k-nearest neighbors algorithm, support vector machines, multiple discriminant analysis, probabilistic artificial neural networks (ANNs), logistic regression, random forests, decision trees, naïve Bayes, and AdaBoost for the diagnosis and classification of breast masses; binary logistic regression for the classification of BI-RADS category 3a lesions; and linear discriminant analysis (LDA) for analyzing axillary lymph node status in breast cancer patients [30][31][32][33][34][35].
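To make this workflow concrete, the sketch below trains one of the listed classifiers, a support vector machine, on a table of pre-extracted features. The synthetic features and labels are placeholders; a real study would use hand-crafted texture and morphology descriptors with biopsy-confirmed labels.

```python
# Classical ML sketch: hand-crafted features -> statistical classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))    # 200 lesions x 10 expert-defined features
y = rng.integers(0, 2, size=200)  # 0 = benign, 1 = malignant (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
# Standardize features, then fit an RBF-kernel support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```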

4. What Is Deep Learning and How Is It Different

Deep learning (DL) is part of a broader family of machine learning methods that mimic the way the human brain learns. DL uses multiple layers to gather knowledge, and the complexity of the learned features increases in a sequential, layer-wise manner [28]. Unlike classical machine learning, deep learning requires little to no human intervention in feature design and uses multiple layers instead of a single layer. DL algorithms have been applied to cancer images from various modalities for diagnosis, classification, lesion segmentation, and other tasks [36]. In some studies, these algorithms have also incorporated clinical or histopathological data to make cancer diagnoses.
There are various types of convolutional neural networks (CNNs). The essential parts of a CNN are the input layer, the output layer, convolutional layers, max-pooling layers, and fully connected layers [28][37]. The input layer must match the shape of the raw input data, and the output layer must match the teaching data [28][37]. For classification tasks, the number of units in the output layer must equal the number of categories in the teaching data [28][37]. The layers between the input and output layers are called hidden layers [28][37].
These multiple convolutional, fully connected, and pooling layers facilitate the learning of progressively more features [28][37]. A convolutional layer extracts features from the input image and passes them to the next layer [28][37]. Convolution maintains the relationships between pixels and produces activations [28][37]. Repeatedly applying the same filter across the input creates a map of activations, called a feature map, which reveals the intensity and location of the features recognized in the input [28][37]. Pooling layers reduce the spatial size of the activation signals to minimize the risk of overfitting [28][37]. Spatial pooling is a form of downsampling that reduces the dimensionality of each feature map while retaining the important information; max pooling is the most common type [28][37].
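As a toy illustration of these two operations, the following sketch (plain NumPy, arbitrary values) slides one 2x2 filter over a small "image" to build a feature map and then applies 2x2 max pooling, keeping the strongest activation in each block.

```python
# Toy convolution followed by 2x2 max pooling (NumPy only; arbitrary values).
import numpy as np

image = np.arange(36, dtype=float).reshape(6, 6)  # toy 6x6 "image"
kernel = np.array([[1.0, 0.0],
                   [0.0, -1.0]])                  # toy 2x2 filter

# Valid cross-correlation (what CNN "convolution" layers actually compute).
h = image.shape[0] - kernel.shape[0] + 1
w = image.shape[1] - kernel.shape[1] + 1
fmap = np.array([[np.sum(image[i:i + 2, j:j + 2] * kernel)
                  for j in range(w)] for i in range(h)])  # 5x5 feature map

# 2x2 max pooling with stride 2: keep the strongest activation per block.
pooled = fmap[:4, :4].reshape(2, 2, 2, 2).max(axis=(1, 3))
print(fmap.shape, pooled.shape)  # (5, 5) -> (2, 2)
```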
The function of a fully connected layer is to take the results from the convolutional/pooling layers and use them to classify the information, such as images, into labels [28][37]. Fully connected layers connect every neuron in one layer to every neuron in the next layer through a linear transformation [28][37]. The signal is then passed through an activation function and output to the next layer of neurons [28][37]. The rectified linear unit (ReLU), a nonlinear transformation, is the most commonly used activation function [28][37]. The output layer is the final layer and produces the given outputs [28][37].
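Putting these pieces together, a CNN classifier might look like the following minimal PyTorch sketch; the architecture and layer sizes are illustrative, not taken from any cited model, and the two output units correspond to the benign/malignant categories, matching the rule that output units equal category count.

```python
# Minimal CNN sketch (PyTorch); illustrative architecture only.
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self, n_classes: int = 2):  # e.g., benign vs. malignant
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # convolutional layer
            nn.ReLU(),                                   # nonlinear activation
            nn.MaxPool2d(2),                             # max pooling
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        # Fully connected output layer: one unit per category.
        self.classifier = nn.Linear(16 * 16 * 16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)       # stacked conv/pool feature maps
        x = torch.flatten(x, 1)    # flatten maps for the linear layer
        return self.classifier(x)  # logits, one per category

# Smoke test with one fake 64x64 grayscale image.
logits = TinyCNN()(torch.randn(1, 1, 64, 64))
print(logits.shape)  # torch.Size([1, 2])
```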

5. IoT Technology in Breast Mass Diagnosis

Recently, the Industrial Internet of Things (IIoT) has emerged as one of the fastest-developing networks able to collect and exchange huge amounts of data using sensors in the medical field [38]. When used in the therapeutic or surgical field, it is sometimes termed the "Internet of Medical Things" (IoMT) or the "Internet of Surgical Things" (IoST), respectively [39][40][41][42]. IoMT refers to a networked infrastructure of medical devices, applications, health systems, and services. It assesses physical properties using portable devices integrated with AI methods, often enabling wireless and remote operation [43][44]. This technology is improving remote patient monitoring, disease diagnosis, and treatment efficiency via telehealth services used by both patients and caregivers [45]. Ragab et al. [46] developed an ensemble deep-learning-based clinical decision support system for breast cancer diagnosis using ultrasound images.
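The ensemble idea behind such decision support systems is often simple soft voting over several trained networks. The sketch below is a minimal illustration of soft voting, not the system of [46]; the untrained stand-in models and image size are placeholders for real trained CNNs.

```python
# Soft-voting ensemble sketch (illustrative; not the pipeline of [46]).
import torch
import torch.nn as nn

def make_member() -> nn.Module:
    """Tiny stand-in for one trained network in the ensemble."""
    return nn.Sequential(nn.Flatten(), nn.Linear(64 * 64, 2))

members = [make_member() for _ in range(3)]  # three ensemble members

@torch.no_grad()
def ensemble_predict(x: torch.Tensor) -> torch.Tensor:
    # Average the class probabilities of all members (soft voting).
    probs = torch.stack([m(x).softmax(dim=1) for m in members])
    return probs.mean(dim=0)

x = torch.randn(4, 1, 64, 64)             # batch of 4 fake grayscale images
print(ensemble_predict(x).argmax(dim=1))  # 0 = benign, 1 = malignant
```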
Singh et al. introduced an IoT-based deep learning model to diagnose breast lesions using pathological datasets [47]. Another study suggested that a sensor system using temperature data from a wearable IoT jacket has the potential to identify breast masses early [48]. One study proposed an IoT-cloud-based health care (ICHC) system framework for breast health monitoring [49]. Peta et al. proposed an IoT-based deep maxout network to classify breast masses using a breast dataset [50]. However, most of these studies did not specify what kind of dataset they used. Image-guided surgery (IGS) using IoT networks may have the potential to improve surgical outcomes in operations where maximum precision is required for tracking anatomical landmarks and instruments [42]. However, there are no studies on IoST-based techniques involving breast US datasets.

References

  1. DeSantis, C.E.; Ma, J.; Gaudet, M.M.; Newman, L.A.; Miller, K.D.; Goding Sauer, A.; Jemal, A.; Siegel, R.L. Breast cancer statistics, 2019. CA A Cancer J. Clin. 2019, 69, 438–451.
  2. Flobbe, K.; Kessels, A.G.H.; Severens, J.L.; Beets, G.L.; de Koning, H.J.; von Meyenfeldt, M.F.; van Engelshoven, J.M.A. Costs and effects of ultrasonography in the evaluation of palpable breast masses. Int. J. Technol. Assess. Health Care 2004, 20, 440–448.
  3. Rubin, E.; Mennemeyer, S.T.; Desmond, R.A.; Urist, M.M.; Waterbor, J.; Heslin, M.J.; Bernreuter, W.K.; Dempsey, P.J.; Pile, N.S.; Rodgers, W.H. Reducing the cost of diagnosis of breast carcinoma. Cancer 2001, 91, 324–332.
  4. Boughey, J.C.; Moriarty, J.P.; Degnim, A.C.; Gregg, M.S.; Egginton, J.S.; Long, K.H. Cost Modeling of Preoperative Axillary Ultrasound and Fine-Needle Aspiration to Guide Surgery for Invasive Breast Cancer. Ann. Surg. Oncol. 2010, 17, 953–958.
  5. Chang, M.C.; Crystal, P.; Colgan, T.J. The evolving role of axillary lymph node fine-needle aspiration in the management of carcinoma of the breast. Cancer Cytopathol. 2011, 119, 328–334.
  6. Pfob, A.; Barr, R.G.; Duda, V.; Büsch, C.; Bruckner, T.; Spratte, J.; Nees, J.; Togawa, R.; Ho, C.; Fastner, S.; et al. A New Practical Decision Rule to Better Differentiate BI-RADS 3 or 4 Breast Masses on Breast Ultrasound. J. Ultrasound Med. 2022, 41, 427–436.
  7. Haloua, M.H.; Krekel, N.M.A.; Coupé, V.M.H.; Bosmans, J.E.; Lopes Cardozo, A.M.F.; Meijer, S.; van den Tol, M.P. Ultrasound-guided surgery for palpable breast cancer is cost-saving: Results of a cost-benefit analysis. Breast 2013, 22, 238–243.
  8. Konen, J.; Murphy, S.; Berkman, A.; Ahern, T.P.; Sowden, M. Intraoperative Ultrasound Guidance With an Ultrasound-Visible Clip: A Practical and Cost-effective Option for Breast Cancer Localization. J. Ultrasound Med. 2020, 39, 911–917.
  9. Ohuchi, N.; Suzuki, A.; Sobue, T.; Kawai, M.; Yamamoto, S.; Zheng, Y.-F.; Shiono, Y.N.; Saito, H.; Kuriyama, S.; Tohno, E.; et al. Sensitivity and specificity of mammography and adjunctive ultrasonography to screen for breast cancer in the Japan Strategic Anti-cancer Randomized Trial (J-START): A randomised controlled trial. Lancet 2016, 387, 341–348.
  10. Ilesanmi, A.E.; Chaumrattanakul, U.; Makhanov, S.S. Methods for the segmentation and classification of breast ultrasound images: A review. J. Ultrasound 2021, 24, 367–382.
  11. Bitencourt, A.; Daimiel Naranjo, I.; Lo Gullo, R.; Rossi Saccarelli, C.; Pinker, K. AI-enhanced breast imaging: Where are we and where are we heading? Eur. J. Radiol. 2021, 142, 109882.
  12. Tufail, A.B.; Ma, Y.K.; Kaabar, M.K.A.; Martínez, F.; Junejo, A.R.; Ullah, I.; Khan, R. Deep Learning in Cancer Diagnosis and Prognosis Prediction: A Minireview on Challenges, Recent Trends, and Future Directions. Comput. Math Methods Med. 2021, 2021, 9025470.
  13. Pesapane, F.; Rotili, A.; Agazzi, G.M.; Botta, F.; Raimondi, S.; Penco, S.; Dominelli, V.; Cremonesi, M.; Jereczek-Fossa, B.A.; Carrafiello, G.; et al. Recent Radiomics Advancements in Breast Cancer: Lessons and Pitfalls for the Next Future. Curr. Oncol. 2021, 28, 2351–2372.
  14. Pang, T.; Wong, J.H.D.; Ng, W.L.; Chan, C.S. Deep learning radiomics in breast cancer with different modalities: Overview and future. Expert Syst. Appl. 2020, 158, 113501.
  15. Ayana, G.; Dese, K.; Choe, S.-W. Transfer Learning in Breast Cancer Diagnoses via Ultrasound Imaging. Cancers 2021, 13, 738.
  16. Huang, Q.; Zhang, F.; Li, X. Machine Learning in Ultrasound Computer-Aided Diagnostic Systems: A Survey. Biomed. Res. Int. 2018, 2018, 5137904.
  17. Mridha, M.F.; Hamid, M.A.; Monowar, M.M.; Keya, A.J.; Ohi, A.Q.; Islam, M.R.; Kim, J.-M. A Comprehensive Survey on Deep-Learning-Based Breast Cancer Diagnosis. Cancers 2021, 13, 6116.
  18. Mahmood, T.; Li, J.; Pei, Y.; Akhtar, F.; Imran, A.; Rehman, K.U. A Brief Survey on Breast Cancer Diagnostic With Deep Learning Schemes Using Multi-Image Modalities. IEEE Access 2020, 8, 165779–165809.
  19. Cardoso, F.; Kyriakides, S.; Ohno, S.; Penault-Llorca, F.; Poortmans, P.; Rubio, I.T.; Zackrisson, S.; Senkus, E. Early breast cancer: ESMO Clinical Practice Guidelines for diagnosis, treatment and follow-up. Ann. Oncol. 2019, 30, 1194–1220.
  20. Chan, H.-P.; Samala, R.K.; Hadjiiski, L.M. CAD and AI for breast cancer—Recent development and challenges. Br. J. Radiol. 2020, 93, 20190580.
  21. Vourtsis, A. Three-dimensional automated breast ultrasound: Technical aspects and first results. Diagn. Interv. Imaging 2019, 100, 579–592.
  22. Wang, H.-Y.; Jiang, Y.-X.; Zhu, Q.-L.; Zhang, J.; Dai, Q.; Liu, H.; Lai, X.-J.; Sun, Q. Differentiation of benign and malignant breast lesions: A comparison between automatically generated breast volume scans and handheld ultrasound examinations. Eur. J. Radiol. 2012, 81, 3190–3200.
  23. Lin, X.; Wang, J.; Han, F.; Fu, J.; Li, A. Analysis of eighty-one cases with breast lesions using automated breast volume scanner and comparison with handheld ultrasound. Eur. J. Radiol. 2012, 81, 873–878.
  24. Wang, X.; Huo, L.; He, Y.; Fan, Z.; Wang, T.; Xie, Y.; Li, J.; Ouyang, T. Early prediction of pathological outcomes to neoadjuvant chemotherapy in breast cancer patients using automated breast ultrasound. Chin. J. Cancer Res. 2016, 28, 478–485.
  25. Zheng, F.-Y.; Lu, Q.; Huang, B.-J.; Xia, H.-S.; Yan, L.-X.; Wang, X.; Yuan, W.; Wang, W.-P. Imaging features of automated breast volume scanner: Correlation with molecular subtypes of breast cancer. Eur. J. Radiol. 2017, 86, 267–275.
  26. Kim, S.H.; Kang, B.J.; Choi, B.G.; Choi, J.J.; Lee, J.H.; Song, B.J.; Choe, B.J.; Park, S.; Kim, H. Radiologists’ Performance for Detecting Lesions and the Interobserver Variability of Automated Whole Breast Ultrasound. Korean J. Radiol. 2013, 14, 154–163.
  27. Abdel-Nasser, M.; Melendez, J.; Moreno, A.; Omer, O.A.; Puig, D. Breast tumor classification in ultrasound images using texture analysis and super-resolution methods. Eng. Appl. Artif. Intell. 2017, 59, 84–92.
  28. Fujioka, T.; Mori, M.; Kubota, K.; Oyama, J.; Yamaga, E.; Yashima, Y.; Katsuta, L.; Nomura, K.; Nara, M.; Oda, G.; et al. The Utility of Deep Learning in Breast Ultrasonic Imaging: A Review. Diagnostics 2020, 10, 1055.
  29. Chartrand, G.; Cheng, P.M.; Vorontsov, E.; Drozdzal, M.; Turcotte, S.; Pal, C.J.; Kadoury, S.; Tang, A. Deep Learning: A Primer for Radiologists. RadioGraphics 2017, 37, 2113–2131.
  30. Yassin, N.I.R.; Omran, S.; El Houby, E.M.F.; Allam, H. Machine learning techniques for breast cancer computer aided diagnosis using different image modalities: A systematic review. Comput. Methods Programs Biomed. 2018, 156, 25–45.
  31. Prabusankarlal, K.M.; Thirumoorthy, P.; Manavalan, R. Assessment of combined textural and morphological features for diagnosis of breast masses in ultrasound. Hum. Cent. Comput. Inf. Sci. 2015, 5, 12.
  32. Wu, W.-J.; Lin, S.-W.; Moon, W.K. An Artificial Immune System-Based Support Vector Machine Approach for Classifying Ultrasound Breast Tumor Images. J. Digit. Imaging 2015, 28, 576–585.
  33. Shan, J.; Alam, S.K.; Garra, B.; Zhang, Y.; Ahmed, T. Computer-Aided Diagnosis for Breast Ultrasound Using Computerized BI-RADS Features and Machine Learning Methods. Ultrasound Med. Biol. 2016, 42, 980–988.
  34. Lo, C.-M.; Moon, W.K.; Huang, C.-S.; Chen, J.-H.; Yang, M.-C.; Chang, R.-F. Intensity-Invariant Texture Analysis for Classification of BI-RADS Category 3 Breast Masses. Ultrasound Med. Biol. 2015, 41, 2039–2048.
  35. Shibusawa, M.; Nakayama, R.; Okanami, Y.; Kashikura, Y.; Imai, N.; Nakamura, T.; Kimura, H.; Yamashita, M.; Hanamura, N.; Ogawa, T. The usefulness of a computer-aided diagnosis scheme for improving the performance of clinicians to diagnose non-mass lesions on breast ultrasonographic images. J. Med. Ultrason. 2016, 43, 387–394.
  36. Madani, M.; Behzadi, M.M.; Nabavi, S. The Role of Deep Learning in Advancing Breast Cancer Detection Using Different Imaging Modalities: A Systematic Review. Cancers 2022, 14, 5334.
  37. Yasaka, K.; Akai, H.; Kunimatsu, A.; Kiryu, S.; Abe, O. Deep learning with convolutional neural network in radiology. Jpn. J. Radiol. 2018, 36, 257–272.
  38. Al-Turjman, F.; Alturjman, S. Context-Sensitive Access in Industrial Internet of Things (IIoT) Healthcare Applications. IEEE Trans. Ind. Inform. 2018, 14, 2736–2744.
  39. Parah, S.A.; Kaw, J.A.; Bellavista, P.; Loan, N.A.; Bhat, G.M.; Muhammad, K.; de Albuquerque, V.H.C. Efficient security and authentication for edge-based internet of medical things. IEEE Internet Things J. 2020, 8, 15652–15662.
  40. Dimitrov, D.V. Medical internet of things and big data in healthcare. Healthc. Inform. Res. 2016, 22, 156–163.
  41. Ogundokun, R.O.; Misra, S.; Douglas, M.; Damaševičius, R.; Maskeliūnas, R. Medical Internet-of-Things Based Breast Cancer Diagnosis Using Hyperparameter-Optimized Neural Networks. Future Internet 2022, 14, 153.
  42. Mulita, F.; Verras, G.-I.; Anagnostopoulos, C.-N.; Kotis, K. A Smarter Health through the Internet of Surgical Things. Sensors 2022, 22, 4577.
  43. Deebak, B.D.; Al-Turjman, F.; Aloqaily, M.; Alfandi, O. An authentic-based privacy preservation protocol for smart e-healthcare systems in IoT. IEEE Access 2019, 7, 135632–135649.
  44. Al-Turjman, F.; Zahmatkesh, H.; Mostarda, L. Quantifying uncertainty in internet of medical things and big-data services using intelligence and deep learning. IEEE Access 2019, 7, 115749–115759.
  45. Huang, C.; Zhang, G.; Chen, S.; Albuquerque, V.H.C.d. An Intelligent Multisampling Tensor Model for Oral Cancer Classification. IEEE Trans. Ind. Inform. 2022, 18, 7853–7861.
  46. Ragab, M.; Albukhari, A.; Alyami, J.; Mansour, R.F. Ensemble deep-learning-enabled clinical decision support system for breast cancer diagnosis and classification on ultrasound images. Biology 2022, 11, 439.
  47. Singh, S.; Srikanth, V.; Kumar, S.; Saravanan, L.; Degadwala, S.; Gupta, S. IOT Based Deep Learning framework to Diagnose Breast Cancer over Pathological Clinical Data. In Proceedings of the 2022 2nd International Conference on Innovative Practices in Technology and Management (ICIPTM), Gautam Buddha Nagar, India, 23–25 February 2022; pp. 731–735.
  48. Ashreetha, B.; Dankan, G.V.; Anandaram, H.; Nithya, B.A.; Gupta, N.; Verma, B.K. IoT Wearable Breast Temperature Assessment System. In Proceedings of the 2023 7th International Conference on Computing Methodologies and Communication (ICCMC), Erode, India, 23–25 February 2023; pp. 1236–1241.
  49. Kavitha, M.; Venkata Krishna, P. IoT-Cloud-Based Health Care System Framework to Detect Breast Abnormality. In Emerging Research in Data Engineering Systems and Computer Communications; Springer: Singapore, 2020; pp. 615–625.
  50. Peta, J.; Koppu, S. An IoT-Based Framework and Ensemble Optimized Deep Maxout Network Model for Breast Cancer Classification. Electronics 2022, 11, 4137.