Ong, W.; Zhu, L.; Zhang, W.; Kuah, T.; Lim, D.S.W.; Low, X.Z.; Thian, Y.L.; Teo, E.C.; Tan, J.H.; Kumar, N.; et al. Artificial Intelligence Methods for Spinal Metastasis Imaging. Encyclopedia. Available online: https://encyclopedia.pub/entry/27921 (accessed on 16 November 2024).
Artificial Intelligence Methods for Spinal Metastasis Imaging

Spinal metastasis is the most common malignant disease of the spine, and its early diagnosis and treatment are important to prevent complications and improve quality of life. The main clinical applications of AI techniques include image processing, diagnosis, decision support, treatment assistance and prognostic outcomes. In spinal oncology, artificial intelligence technologies have achieved relatively good performance and hold immense potential to aid clinicians, including by enhancing work efficiency and reducing adverse events.

Keywords: artificial intelligence; machine learning; deep learning; spinal metastasis; imaging

1. Introduction

Spinal metastasis is a malignant process of the spine that is up to 35 times more common than any primary malignant disease of the spine [1], and the spine represents the third most common location for metastases [2]. Spinal metastasis can tremendously impact quality of life, secondary to complications such as pain due to fractures, spinal cord compression, neurological deficits [3][4], reduced mobility, bone marrow aplasia and hypercalcemia, the last leading to symptoms such as constipation, polyuria, polydipsia, fatigue and even cardiac arrhythmias and acute renal failure [5][6]. Therefore, the timely detection, diagnosis and optimal treatment of spinal metastases are essential to reduce complications and to improve patients’ quality of life [7].
Radiological investigations play a central role in the diagnosis and treatment planning of spinal metastases. Plain radiographs are a quick and inexpensive first-line investigation, although advanced modalities such as computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography (PET) and bone scintigraphy are all superior for the detection and classification of spinal metastases [8]. Each imaging modality has its own advantages in the assessment of spinal metastasis. CT, with a sensitivity and specificity of 79.2% and 92.3%, respectively, for the detection of spinal metastases [9], can be used to guide interventional procedures and also provides systemic staging [10]. Compared to CT, MRI has a higher sensitivity and specificity of 94.1% and 94.2%, respectively, for spinal metastasis detection [9], and is radiation-free. MRI is the modality of choice for assessing metastatic spread to the bone marrow and associated epidural soft tissue extension [11][12]. 18F-FDG (fluorodeoxyglucose) PET has a sensitivity and specificity of 89.8% and 63.3%, respectively, although sensitivity varies among different histologies due to their innate metabolic activity [9][13]. Bone scintigraphy has a sensitivity and specificity of 80.0% and 92.8%, respectively, and is the most widely available technique for the study of metastatic bone disease [8][9].
Recently, preliminary artificial intelligence (AI) techniques have demonstrated remarkable progress in medical imaging applications, especially in the field of oncology [14]. The two most popular machine learning approaches are radiomics-based feature analysis and convolutional neural networks (CNN). Radiomics-based techniques require the extraction of several handcrafted features, which are then selected to provide a training set for image classification [15]. One drawback of this technique is that the selected handcrafted features remain limited to the knowledge of the radiologist or clinician, which could reduce the accuracy of the developed algorithm [16]. Deep learning techniques, in contrast, can directly learn the imaging features important for classification without the need for handcrafted feature selection. These typically involve convolutional neural networks and have been shown in the literature to improve prediction accuracy for lesion detection, segmentation and treatment response in oncological imaging [17][18][19].

2. Artificial Intelligence Methods for Spinal Metastasis Imaging 

2.1. Artificial Intelligence (AI)

Artificial Intelligence (AI)

Artificial intelligence (AI) is a term referring to a machine’s computational ability to perform tasks comparable to those executed by humans, by utilising unique inputs and then generating outputs with high added value [20]. With recent advances in medical imaging and ever-increasing amounts of digital image and report data, worldwide interest in AI for medical imaging continues to grow [21]. AI and computer-aided diagnostic (CAD) systems were initially intended to assist clinicians or radiologists in the detection of tumours or lesions, which in turn increases efficiency, improves detection and reduces error rates [22]. As a result, efforts are ongoing to enhance the diagnostic ability and efficiency of AI so that it can be successfully translated into clinical practice [23]. With the advent of artificial neural networks, which are a class of architectures loosely based on how the human brain works [24], several computational learning models (mainly machine learning (ML) and deep learning (DL) algorithms) have been introduced and are largely responsible for the growth of AI in radiology. In general, the clinical applications of AI (Figure 1) can be broadly characterised into three categories for the oncology imaging workflow: (1) detection of abnormalities; (2) characterisation of abnormalities, which includes image processing steps such as segmentation, differentiation and classification; and (3) integrated diagnostics, which includes decision support for treatment decisions and planning, treatment response and prognosis prediction.

Figure 1. Schematic outline showing where AI implementation can optimise the radiology workflow. The workflow comprises the following steps: image acquisition, image processing, image-based tasks, reporting, and integrated diagnostics. AI can add value to the image-based clinical tasks, including the detection of abnormalities; characterisation of objects in images using segmentation, diagnosis and staging; and integrated diagnostics including decision support for treatment planning and prognosis prediction.

Machine Learning (ML)

Machine learning is a field of AI in which models are trained for prediction using known datasets, from which the machine “learns”. The developed model then applies its knowledge to perform diagnostic tasks in unknown datasets [25]. Applying ML requires data inputs that are either labelled by human experts (typically radiologists) or processed directly, using computational methods that broadly fall into supervised and unsupervised learning. Supervised machine learning models rely on labelled input data to learn the relationship with the output training data [26], and are often used to classify data or make predictions. On the other hand, unsupervised machine learning models are trained on unlabelled raw data and discover the inherent relationships, patterns and trends within the dataset [27][28]. Unsupervised models are mainly used to derive an efficient representation of the initial dataset (e.g., densities, distances or clustering through dataset statistical properties) and to better understand relationships or patterns within the data [29][30]. Such a new representation can be an initial step prior to training a supervised model (e.g., identifying anomalies and outliers within the datasets), which could improve the supervised model’s performance [31][32][33].
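The contrast between supervised and unsupervised learning described above can be illustrated with a minimal, self-contained sketch. The data (a single toy "imaging feature" per lesion) and both models are illustrative stand-ins, not methods from any of the cited studies: a nearest-centroid classifier learns from labelled examples, while 2-means clustering recovers a similar grouping from the same values without labels.

```python
# Toy 1-D "imaging feature" (e.g., lesion intensity): labelled examples
# for supervised learning; the same values, unlabelled, for unsupervised.
labelled = [(1.0, "benign"), (1.2, "benign"), (0.9, "benign"),
            (3.0, "metastasis"), (3.3, "metastasis"), (2.8, "metastasis")]

# Supervised: a nearest-centroid classifier learns one centroid per label.
def train_centroids(data):
    sums, counts = {}, {}
    for x, y in data:
        sums[y] = sums.get(y, 0.0) + x
        counts[y] = counts.get(y, 0) + 1
    return {y: sums[y] / counts[y] for y in sums}

def predict(centroids, x):
    return min(centroids, key=lambda y: abs(x - centroids[y]))

centroids = train_centroids(labelled)
print(predict(centroids, 3.1))  # classifies a new, unseen value

# Unsupervised: 2-means clustering finds the two groups without labels.
def kmeans_1d(points, iters=10):
    c = [min(points), max(points)]  # initialise centroids at the extremes
    for _ in range(iters):
        groups = [[], []]
        for p in points:
            groups[0 if abs(p - c[0]) < abs(p - c[1]) else 1].append(p)
        c = [sum(g) / len(g) if g else c[i] for i, g in enumerate(groups)]
    return c

unlabelled = [x for x, _ in labelled]
print(sorted(round(v, 2) for v in kmeans_1d(unlabelled)))
```

Note that the unsupervised clusters converge to essentially the same centroids the supervised model learned, but the cluster labels themselves carry no clinical meaning until a human assigns one.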

Deep Learning (DL)

Deep learning represents a subdivision of machine learning (Figure 2) and is modelled on the neuronal architecture of the brain. The technique leverages artificial neural networks, which involve several layers, to solve complex medical imaging challenges [34]. The multi-layered structure enables the deep learning model or algorithm to actively learn knowledge from imaging datasets and make predictions on unseen imaging data [35]. These deep learning techniques can provide accurate image classification (disease present/absent, or severity of disease), segmentation (pixel-based) and detection capability [36].
Figure 2. Diagram of artificial intelligence hierarchy. Machine learning lies within the field of artificial intelligence and is an area of study that enables computers to learn without explicit knowledge or programming. Within machine learning, deep learning is another area of study that enables computation of neural networks involving multiple layers. Finally, convolutional neural networks (CNN) are an important subset of deep learning, commonly applied to analyse medical images.
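The convolution operation at the heart of a CNN layer can be sketched in a few lines of plain Python. The kernel values, the toy image and the "sclerotic focus" interpretation are all illustrative assumptions; real networks learn their kernels from data and stack many such layers.

```python
# Minimal sketch of the core CNN building block: a 2-D convolution
# (valid padding, stride 1) followed by a ReLU activation.
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(kh) for b in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(fmap):
    # Non-linearity: keeps only positive responses.
    return [[max(0, v) for v in row] for row in fmap]

# A toy 4x4 "image" with a bright region on the right.
image = [[0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9],
         [0, 0, 9, 9]]

# Vertical-edge kernel: responds where intensity changes left-to-right.
kernel = [[-1, 1],
          [-1, 1]]

feature_map = relu(conv2d(image, kernel))
print(feature_map)  # strong response only at the edge column
```

The resulting feature map is zero everywhere except at the boundary of the bright region, which is exactly the kind of localised response that later layers combine into detections of lesions or anatomical structures.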

Radiomics

Radiomics is a relatively new branch of machine learning that involves converting medical images containing important information related to tumour features into measurable and quantifiable data [37]. This information can then aid clinicians in the assessment of tumours by providing additional data about tumour behaviour and pathophysiology beyond that of current subjective visual interpretation [38][39], such as tumour subtyping and grading [40]. Combined with clinical and qualitative imaging data, radiomics has been shown to guide and improve medical decision making [41], and can be used to aid disease prediction, provide prognostic information and assess treatment response [40]. In general, the workflow for deriving a radiomics model can be divided into several steps (Figure 3): data selection (input), radiological imaging evaluation and segmentation, image feature extraction in the regions of interest (ROIs), and exploratory analysis followed by modelling [37]. Depending on the imaging modality, the acquisition, technical specifications, software, segmentation of the ROIs, image feature extraction and structure of the predictive algorithm all differ and are subject to several factors [42]. Machine learning methods, including the random decision forest, an ensemble learning method that classifies data using decision trees, can then be used to validate and further evaluate the classification accuracy of the set of predictors [43]. These can then be applied in a clinical setting to potentially improve diagnostic accuracy and the prediction of survival post-treatment [44][45].
Figure 3. Diagram showing the general framework and main steps for radiomics, namely data selection (input), medical imaging evaluation and segmentation, feature extraction in the regions of interest (ROIs), exploratory analysis and modelling.
There are two key radiomics techniques, namely handcrafted-feature-based and deep learning-based analysis [46]. Firstly, handcrafted-feature radiomics involves the extraction of features from an area of interest (typically segmented). These features can be grouped by shape [47], histogram criteria (first-order statistics) [48], textural criteria (second-order statistics) [22] and other higher-order statistical criteria [49]. Following this step, machine learning models can be developed to provide clinical predictions, including survival/prognostic information, based on the handcrafted features [44][50][51]. The models are also assessed on validation datasets to check their efficiency and sensitivity.
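A few of the first-order (histogram) features mentioned above can be computed directly from the intensities inside a segmented ROI. This is a minimal sketch with toy intensity values; real radiomics pipelines compute dozens of standardised features (and second-order texture features over a co-occurrence matrix), with discretisation settings that strongly affect the result.

```python
import math

# Toy ROI intensities: two distinct tissue populations within the region.
roi = [12.0, 14.0, 13.0, 15.0, 40.0, 42.0, 41.0, 39.0]

def first_order_features(values):
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / n
    std = math.sqrt(var)
    # Skewness: asymmetry of the intensity distribution.
    skew = sum((v - mean) ** 3 for v in values) / (n * std ** 3)
    # Shannon entropy over a coarse 4-bin histogram: intensity heterogeneity.
    lo, hi = min(values), max(values)
    bins = [0] * 4
    for v in values:
        idx = min(3, int((v - lo) / (hi - lo + 1e-9) * 4))
        bins[idx] += 1
    probs = [b / n for b in bins if b > 0]
    entropy = -sum(p * math.log2(p) for p in probs)
    return {"mean": mean, "std": std, "skewness": skew, "entropy": entropy}

features = first_order_features(roi)
print(features)
```

Feature vectors like this, computed per lesion, form the rows of the training table handed to the downstream classifier.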
In contrast to handcrafted-feature radiomics, deep learning techniques rely on convolutional neural networks (CNN) or other architectures [52] to identify the most pertinent radiological imaging features without relying on prior feature descriptions. CNNs provide automated extraction of the most important features from the radiological imaging data using a cascading process, which can then be used for pattern recognition and training [53]. The generated dominant imaging features can undergo further processing, or exit the neural network and be used for machine learning model generation using algorithms similar to those of the feature-based radiomics method before validation. The main drawback of deep learning-based radiomics is the requirement for much larger training datasets, since the network must learn feature extraction as part of the initial process, whereas in feature-based radiomics the features are manually selected for analysis [54]. With recent advances in AI, this limitation can be circumvented through transfer learning, a technique that uses neural networks pre-trained for a separate but closely related purpose [55]. By leveraging the network’s prior knowledge, transfer learning reduces computational demand and the amount of training data required, while still producing reliable performance.
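The transfer-learning idea can be sketched without any deep learning framework. Everything here is a toy stand-in: the "pretrained" extractor is a fixed linear-plus-ReLU layer with assumed weights (a real one would be a network trained on a large related dataset), and only a small logistic-regression head is fitted on the new task's data, mirroring the freeze-the-backbone, retrain-the-head recipe.

```python
import math

# Stand-in for a pretrained feature extractor: fixed weights, never updated.
FROZEN_W = [[0.9, -0.4], [-0.3, 1.1]]  # assumed "pretrained" values

def extract(x):
    # Frozen layer: linear map + ReLU.
    return [max(0.0, sum(w * v for w, v in zip(row, x))) for row in FROZEN_W]

def train_head(data, lr=0.5, epochs=200):
    w, b = [0.0, 0.0], 0.0  # only these parameters are learned
    for _ in range(epochs):
        for x, y in data:
            f = extract(x)
            z = sum(wi * fi for wi, fi in zip(w, f)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = p - y  # gradient of the logistic loss w.r.t. z
            w = [wi - lr * g * fi for wi, fi in zip(w, f)]
            b -= lr * g
    return w, b

# Small task-specific dataset (toy 2-D inputs, binary labels).
data = [([1.0, 0.1], 0), ([0.9, 0.2], 0), ([0.1, 1.0], 1), ([0.2, 0.9], 1)]
w, b = train_head(data)

def classify(x):
    f = extract(x)
    return int(sum(wi * fi for wi, fi in zip(w, f)) + b > 0)

print(classify([1.0, 0.0]), classify([0.0, 1.0]))
```

Because the frozen layer already maps the two classes onto different feature channels, the head separates them with only four training examples, which is precisely the data-efficiency argument made for transfer learning above.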
Radiomics techniques have transformed the outlook of quantitative medical imaging research. Radiomics could provide rapid, comprehensive characterisation of tumours at minimal cost, which would act as an initial screen to determine the need for further clinical or genomic testing [56].

2.2. Artificial Intelligence Methods for Spinal Metastasis Imaging

Detection of Spinal Metastases

Early detection and diagnosis of spinal metastases play a key role in clinical practice. They determine the stage of disease for the patient and have the potential to alter the treatment regimen [57]. Metastatic spinal disease is associated with increased morbidity, and more than half of these patients will require radiotherapy or invasive intervention for complications, such as spinal cord or nerve root compression [58]. Hence, early diagnosis and treatment before permanent neurologic and functional deficits occur is essential for a favourable prognosis [59][60][61].
Manual detection of spinal metastasis across various imaging modalities is time-consuming, tedious and often challenging, because its imaging features overlap with those of many other pathologies. It is widely recognised that automated lesion detection could improve radiologist sensitivity for detecting osseous metastases, with computer-aided detection (CAD) software systems and artificial intelligence models proving to be as effective as, or even superior to, manual radiologist detection [62][63][64]. Computer-assisted detection of spinal metastases was first studied on CT by O’Connor et al. [64] in 2007 for the detection of osteolytic spinal metastases. This paved the way for further CAD studies focusing on other subtypes of spinal metastasis, such as osteoblastic or mixed-type lesions [65], and on other imaging modalities. Subsequently, with recent advances in artificial intelligence in medical imaging [66][67][68], there have been substantial improvements in the detection of spinal metastases with the aid of deep learning and convolutional neural networks. This has improved the accuracy of computer-assisted automated detection of spinal metastases across various imaging modalities, with significant reductions in false-positive and false-negative rates [69][70][71][72].

Differentiating Spinal Metastases from Other Pathological Conditions

Machine learning has been applied in several studies to help distinguish between spinal metastases and other pathologies. This was first done by identifying key radiomics features in vertebral metastases [73] and incorporating this information into various machine learning models. For example, Liu et al. [74] and Xiong et al. [75] utilised MRI-based radiomics to differentiate between spinal metastases and multiple myeloma, based on conventional T1-weighted (T1W) and fat-suppressed T2-weighted (T2W) MR sequences. They incorporated the radiomics models into various machine learning algorithms, such as the Support-Vector Machine (SVM), Random Forest (RF), K-Nearest Neighbour (KNN), Naïve Bayes (NB) with 10-fold cross-validation, Artificial Neural Networks (ANN) and the Logistic Regression Classifier, to predict the likelihood of spinal metastases. The radiomics model from Xiong et al. used features from T2W images and achieved an accuracy, sensitivity and specificity of 81.5%, 87.9% and 79.0%, respectively, in their validation cohort. The model from Liu et al. with 10 EPV (events per independent variable) showed good performance in distinguishing multiple myeloma from spinal metastases, with an AUC of 0.85.
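The 10-fold cross-validation used in such studies can be sketched generically: each fold serves once as the validation set while the model is fitted on the remaining folds, and the fold accuracies are averaged. The data and the threshold classifier below are toy stand-ins, not the models from the cited papers.

```python
# Generic k-fold cross-validation over (feature, label) pairs.
def k_fold_cv(data, k, fit, predict):
    folds = [data[i::k] for i in range(k)]  # round-robin split
    accuracies = []
    for i in range(k):
        val = folds[i]
        train = [d for j, f in enumerate(folds) if j != i for d in f]
        model = fit(train)
        correct = sum(predict(model, x) == y for x, y in val)
        accuracies.append(correct / len(val))
    return sum(accuracies) / k

# Toy classifier: threshold at the midpoint of the two class means.
def fit(train):
    c0 = [x for x, y in train if y == 0]
    c1 = [x for x, y in train if y == 1]
    return (sum(c0) / len(c0) + sum(c1) / len(c1)) / 2

def predict(threshold, x):
    return int(x > threshold)

# 20 one-dimensional samples, two well-separated classes.
data = [(i / 10, 0) for i in range(10)] + [(2 + i / 10, 1) for i in range(10)]
print(k_fold_cv(data, 10, fit, predict))
```

With real, noisier radiomics features the fold accuracies vary, and stratified splitting (preserving class balance per fold) is usually preferred over the simple round-robin split shown here.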

Pre-Treatment Evaluation

Prediction of prognosis is a key paradigm in oncological treatment. In patients with vertebral metastases, the ability to predict treatment response may help clinicians provide the most appropriate treatment with the best clinical outcome, avoid a delayed transition to another treatment and prevent exposing patients to unnecessary treatment-related side effects. Shi et al. [76] studied the value of MRI-based radiomics in predicting the response to chemotherapy in a small group of breast cancer patients with vertebral metastases. Their radiomics model was effective in predicting progressive versus non-progressive disease, with an area under the curve (AUC) of up to 0.91. This method could be extrapolated in future studies to predict the treatment response of spinal metastases from other primary tumours.

The applications of deep learning models go beyond tumour detection and differentiation; they can also automatically generate meaningful parameters from MRI and other modalities. Hallinan et al. [77] developed a deep learning model for the automated classification of metastatic epidural disease and/or spinal cord compression on MRI using the Bilsky classification. The model showed almost perfect agreement with specialist readers on internal and external datasets, with kappas of 0.92–0.98 (p < 0.001) and 0.94–0.95 (p < 0.001), respectively, for dichotomous Bilsky classification (low versus high grade). Accurate, reproducible classification of metastatic epidural spinal cord compression will enable clinicians to decide on initial radiotherapy versus surgical intervention [78].
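The kappa values quoted above measure agreement corrected for chance. A minimal sketch of Cohen's kappa for a dichotomous grading follows; the model and reader labels are invented toy data, not results from the cited study.

```python
# Cohen's kappa: observed agreement corrected for the agreement
# expected by chance from each rater's marginal label frequencies.
def cohens_kappa(a, b):
    n = len(a)
    observed = sum(x == y for x, y in zip(a, b)) / n
    pa1 = sum(a) / n  # fraction of "high grade" from rater a
    pb1 = sum(b) / n  # fraction of "high grade" from rater b
    expected = pa1 * pb1 + (1 - pa1) * (1 - pb1)
    return (observed - expected) / (1 - expected)

# Toy dichotomous grades (0 = low grade, 1 = high grade) for ten studies.
model_grades  = [0, 0, 1, 1, 1, 0, 0, 1, 0, 1]
reader_grades = [0, 0, 1, 1, 1, 0, 0, 1, 1, 1]
print(round(cohens_kappa(model_grades, reader_grades), 2))
```

Here nine of ten grades agree (90% raw agreement), but because half of that agreement would be expected by chance, kappa is 0.8, illustrating why kappa is reported instead of raw accuracy for reader-agreement studies.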

Segmentation refers to the delineation or volume extraction of a lesion or organ based on image analysis. In clinical practice, manual or semi-manual segmentation techniques are applied to provide further value to CT and MRI studies. However, these techniques are subjective, operator-dependent and very time-consuming, which limits their adoption. Automatic segmentation of spinal metastases using deep learning models has been shown to be as accurate as expert annotation on both MRI and CT [71]. Hille et al. [79] showed that automated vertebral metastasis segmentation on MRI using deep convolutional neural networks (a U-Net-like architecture) was almost as accurate as expert annotation. Their automated segmentation solution achieved a Dice–Sørensen coefficient (DSC) of up to 0.78 and mean sensitivity rates of up to 78.9%, on par with an inter-reader variability DSC of 0.79. Potentially, these models will not only reduce the need for time-consuming manual segmentation of spinal metastases, but also support stereotactic body radiotherapy planning and improve the performance [80][81] and treatment outcomes of minimally invasive interventions for spinal metastasis, such as radiofrequency ablation [82]. With respect to radiotherapy, precise automated tumour contours will improve treatment planning, reduce segmentation times and reduce the radiation dose to the surrounding organs at risk, including the spinal cord. In recent years, various image segmentation techniques have been proposed, resulting in more accurate and efficient image segmentation for clinical diagnosis and treatment [83][84][85][86].
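The DSC reported in these studies compares the overlap between two binary masks. A minimal sketch with toy 2-D masks (stand-ins for an expert annotation and a model output) follows.

```python
# Dice–Sørensen coefficient between two binary masks:
# DSC = 2 * |intersection| / (|prediction| + |ground truth|).
def dice(pred, truth):
    inter = sum(p == t == 1 for row_p, row_t in zip(pred, truth)
                for p, t in zip(row_p, row_t))
    size_p = sum(v for row in pred for v in row)
    size_t = sum(v for row in truth for v in row)
    return 2 * inter / (size_p + size_t)

# Toy masks: the model over-segments the lesion by one pixel.
expert = [[0, 1, 1, 0],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]
model  = [[0, 1, 1, 1],
          [0, 1, 1, 0],
          [0, 0, 0, 0]]

print(round(dice(model, expert), 2))
```

A DSC of 1.0 would mean perfect overlap; values around 0.78, as in the cited study, indicate substantial but imperfect agreement, comparable to the variability between two human readers.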

Radiogenomics, the combination of “Radiomics” and “Genomics”, refers to the use of imaging features or surrogates to determine genomic signatures and advanced biomarkers in tumours. These biomarkers can then be used for clinical management decisions, including prognostic, diagnostic and predictive precision of tumour subtypes [87]. The workflow of a radiogenomics study can be commonly classified into five different stages (Figure 4): (1) image acquisition and pre-processing, (2) feature extraction and selection from both the medical imaging and genotype, (3) association of radiomics and genomics features, (4) data analysis using machine learning models and (5) final radiogenomics outcome model [88].
Figure 4. Diagram showing a five-stage radiogenomics pipeline including data acquisition (radiological imaging) and pre-processing, feature extraction and selection, subsequent association of radiomics techniques and genomics, analysis of data and model development and, finally, radiogenomics outcomes.

Post-Treatment Evaluation

Zhong et al. [89] created an MRI-based radiomics nomogram that was shown to be clinically useful in discriminating between cervical spine osteoradionecrosis and metastases, with an AUC of 0.73 on the training set and 0.72 in the validation set.

Pseudo-progression is a post-treatment phenomenon involving an increase in the target tumour volume (usually without any worsening symptoms), which then demonstrates interval stability or a reduction in volume on repeat imaging. It occurs in approximately 14 to 18% of patients with vertebral metastases treated with stereotactic body radiotherapy [90][91]. Differentiating pseudo-progression from true progression on imaging is challenging, even though many studies have suggested differentiating factors [91][92], such as the location of involvement, e.g., involvement purely of the vertebral body in pseudo-progression compared to involvement of the epidural space in true progression. Artificial intelligence has already shown utility in aiding the differentiation of pseudo-progression from true progression in brain imaging.

References

  1. Mundy, G.R. Metastasis to bone: Causes, consequences and therapeutic opportunities. Nat. Rev. Cancer 2002, 2, 584–593.
  2. Witham, T.F.; Khavkin, Y.A.; Gallia, G.L.; Wolinsky, J.P.; Gokaslan, Z.L. Surgery insight: Current management of epidural spinal cord compression from metastatic spine disease. Nat. Clin. Pract. Neurol. 2006, 2, 87–94, quiz 116.
  3. Klimo, P., Jr.; Schmidt, M.H. Surgical management of spinal metastases. Oncologist 2004, 9, 188–196.
  4. Coleman, R.E. Metastatic bone disease: Clinical features, pathophysiology and treatment strategies. Cancer Treat Rev. 2001, 27, 165–176.
  5. Cuccurullo, V.; Cascini, G.L.; Tamburrini, O.; Rotondo, A.; Mansi, L. Bone metastases radiopharmaceuticals: An overview. Curr. Radiopharm. 2013, 6, 41–47.
  6. Cecchini, M.G.; Wetterwald, A.; Pluijm, G.v.d.; Thalmann, G.N. Molecular and Biological Mechanisms of Bone Metastasis. EAU Update Ser. 2005, 3, 214–226.
  7. Yu, H.H.; Tsai, Y.Y.; Hoffe, S.E. Overview of diagnosis and management of metastatic disease to bone. Cancer Control 2012, 19, 84–91.
  8. O’Sullivan, G.J.; Carty, F.L.; Cronin, C.G. Imaging of bone metastasis: An update. World J. Radiol. 2015, 7, 202–211.
  9. Liu, T.; Wang, S.; Liu, H.; Meng, B.; Zhou, F.; He, F.; Shi, X.; Yang, H. Detection of vertebral metastases: A meta-analysis comparing MRI, CT, PET, BS and BS with SPECT. J. Cancer Res. Clin. Oncol. 2017, 143, 457–465.
  10. Wallace, A.N.; Greenwood, T.J.; Jennings, J.W. Use of Imaging in the Management of Metastatic Spine Disease With Percutaneous Ablation and Vertebral Augmentation. AJR Am. J. Roentgenol. 2015, 205, 434–441.
  11. Moynagh, M.R.; Colleran, G.C.; Tavernaraki, K.; Eustace, S.J.; Kavanagh, E.C. Whole-body magnetic resonance imaging: Assessment of skeletal metastases. Semin. Musculoskelet. Radiol. 2010, 14, 22–36.
  12. Schiff, D.; O’Neill, B.P.; Wang, C.H.; O’Fallon, J.R. Neuroimaging and treatment implications of patients with multiple epidural spinal metastases. Cancer 1998, 83, 1593–1601.
  13. Talbot, J.N.; Paycha, F.; Balogova, S. Diagnosis of bone metastasis: Recent comparative studies of imaging modalities. Q. J. Nucl. Med. Mol. Imaging 2011, 55, 374–410.
  14. Tran, K.A.; Kondrashova, O.; Bradley, A.; Williams, E.D.; Pearson, J.V.; Waddell, N. Deep learning in cancer diagnosis, prognosis and treatment selection. Genome Med. 2021, 13, 152.
  15. Liu, Z.; Wang, S.; Dong, D.; Wei, J.; Fang, C.; Zhou, X.; Sun, K.; Li, L.; Li, B.; Wang, M.; et al. The Applications of Radiomics in Precision Diagnosis and Treatment of Oncology: Opportunities and Challenges. Theranostics 2019, 9, 1303–1322.
  16. Parekh, V.S.; Jacobs, M.A. Deep learning and radiomics in precision medicine. Expert Rev. Precis Med. Drug Dev. 2019, 4, 59–72.
  17. Esteva, A.; Kuprel, B.; Novoa, R.A.; Ko, J.; Swetter, S.M.; Blau, H.M.; Thrun, S. Dermatologist-level classification of skin cancer with deep neural networks. Nature 2017, 542, 115–118.
  18. Kermany, D.S.; Goldbaum, M.; Cai, W.; Valentim, C.C.S.; Liang, H.; Baxter, S.L.; McKeown, A.; Yang, G.; Wu, X.; Yan, F.; et al. Identifying Medical Diagnoses and Treatable Diseases by Image-Based Deep Learning. Cell 2018, 172, 1122–1131 e1129.
  19. De Fauw, J.; Ledsam, J.R.; Romera-Paredes, B.; Nikolov, S.; Tomasev, N.; Blackwell, S.; Askham, H.; Glorot, X.; O’Donoghue, B.; Visentin, D.; et al. Clinically applicable deep learning for diagnosis and referral in retinal disease. Nat. Med. 2018, 24, 1342–1350.
  20. Rajkomar, A.; Dean, J.; Kohane, I. Machine Learning in Medicine. N. Engl. J. Med. 2019, 380, 1347–1358.
  21. Thrall, J.H.; Li, X.; Li, Q.; Cruz, C.; Do, S.; Dreyer, K.; Brink, J. Artificial Intelligence and Machine Learning in Radiology: Opportunities, Challenges, Pitfalls, and Criteria for Success. J. Am. Coll Radiol. 2018, 15, 504–508.
  22. Hosny, A.; Parmar, C.; Quackenbush, J.; Schwartz, L.H.; Aerts, H. Artificial intelligence in radiology. Nat. Rev. Cancer 2018, 18, 500–510.
  23. Nagoev, Z.V.; Sundukov, Z.A.; Pshenokova, I.A.; Denisenko, V.A. Architecture of CAD for distributed artificial intelligence based on self-organizing neuro-cognitive architectures. News Kabard.–Balkar Sci. Cent. RAS 2020, 2, 40–47.
  24. Kriegeskorte, N. Deep Neural Networks: A New Framework for Modeling Biological Vision and Brain Information Processing. Annu. Rev. Vis. Sci. 2015, 1, 417–446.
  25. Erickson, B.J.; Korfiatis, P.; Akkus, Z.; Kline, T.L. Machine Learning for Medical Imaging. Radiographics 2017, 37, 505–515.
  26. Zhu, X.; Goldberg, A.B. Introduction to Semi-Supervised Learning. Synth. Lect. Artif. Intell. Mach. Learn. 2009, 3, 1–130.
  27. Sidey-Gibbons, J.A.M.; Sidey-Gibbons, C.J. Machine learning in medicine: A practical introduction. BMC Med. Res. Methodol. 2019, 19, 64.
  28. Cao, B.; Araujo, A.; Sim, J. Unifying Deep Local and Global Features for Image Search. In Proceedings of the Computer Vision–ECCV 2020, Glasgow, UK, 23–28 August 2020; pp. 726–743.
  29. Bengio, Y. Learning Deep Architectures for AI. Found. Trends Mach. Learn. 2009, 2, 1–127.
  30. Montagnon, E.; Cerny, M.; Cadrin-Chênevert, A.; Hamilton, V.; Derennes, T.; Ilinca, A.; Vandenbroucke-Menu, F.; Turcotte, S.; Kadoury, S.; Tang, A. Deep learning workflow in radiology: A primer. Insights Imaging 2020, 11, 22.
  31. Li, H.; Galperin-Aizenberg, M.; Pryma, D.; Simone, C.B., 2nd; Fan, Y. Unsupervised machine learning of radiomic features for predicting treatment response and overall survival of early stage non-small cell lung cancer patients treated with stereotactic body radiation therapy. Radiother. Oncol. 2018, 129, 218–226.
  32. Alaverdyan, Z.; Jung, J.; Bouet, R.; Lartizien, C. Regularized siamese neural network for unsupervised outlier detection on brain multiparametric magnetic resonance imaging: Application to epilepsy lesion screening. Med. Image Anal. 2020, 60, 101618.
  33. Tlusty, T.; Amit, G.; Ben-Ari, R. Unsupervised clustering of mammograms for outlier detection and breast density estimation. In Proceedings of the 2018 24th International Conference on Pattern Recognition (ICPR), Beijing, China, 20–24 August 2018; pp. 3808–3813.
  34. Zaharchuk, G.; Gong, E.; Wintermark, M.; Rubin, D.; Langlotz, C.P. Deep Learning in Neuroradiology. AJNR Am. J. Neuroradiol. 2018, 39, 1776–1784.
  35. Kaka, H.; Zhang, E.; Khan, N. Artificial intelligence and deep learning in neuroradiology: Exploring the new frontier. Can. Assoc. Radiol. J. 2021, 72, 35–44.
  36. Cheng, P.M.; Montagnon, E.; Yamashita, R.; Pan, I.; Cadrin-Chênevert, A.; Romero, F.P.; Chartrand, G.; Kadoury, S.; Tang, A. Deep Learning: An Update for Radiologists. RadioGraphics 2021, 41, 1427–1445.
  37. van Timmeren, J.E.; Cester, D.; Tanadini-Lang, S.; Alkadhi, H.; Baessler, B. Radiomics in medical imaging—“how-to” guide and critical reflection. Insights Imaging 2020, 11, 91.
  38. Faiella, E.; Santucci, D.; Calabrese, A.; Russo, F.; Vadala, G.; Zobel, B.B.; Soda, P.; Iannello, G.; de Felice, C.; Denaro, V. Artificial Intelligence in Bone Metastases: An MRI and CT Imaging Review. Int. J. Environ. Res. Public Health 2022, 19, 1880.
  39. Mannil, M.; von Spiczak, J.; Manka, R.; Alkadhi, H. Texture Analysis and Machine Learning for Detecting Myocardial Infarction in Noncontrast Low-Dose Computed Tomography: Unveiling the Invisible. Investig. Radiol. 2018, 53, 338–343.
  40. Aerts, H.J. The Potential of Radiomic-Based Phenotyping in Precision Medicine: A Review. JAMA Oncol. 2016, 2, 1636–1642.
  41. Lambin, P.; Rios-Velazquez, E.; Leijenaar, R.; Carvalho, S.; van Stiphout, R.G.; Granton, P.; Zegers, C.M.; Gillies, R.; Boellard, R.; Dekker, A.; et al. Radiomics: Extracting more information from medical images using advanced feature analysis. Eur. J. Cancer 2012, 48, 441–446.
  42. Valladares, A.; Beyer, T.; Rausch, I. Physical imaging phantoms for simulation of tumor heterogeneity in PET, CT, and MRI: An overview of existing designs. Med. Phys. 2020, 47, 2023–2037.
  43. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32.
  44. Zhang, Y.; Oikonomou, A.; Wong, A.; Haider, M.A.; Khalvati, F. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer. Sci. Rep. 2017, 7, 46349.
  45. Jia, T.-Y.; Xiong, J.-F.; Li, X.-Y.; Yu, W.; Xu, Z.-Y.; Cai, X.-W.; Ma, J.-C.; Ren, Y.-C.; Larsson, R.; Zhang, J.; et al. Identifying EGFR mutations in lung adenocarcinoma by noninvasive imaging using radiomics features and random forest modeling. Eur. Radiol. 2019, 29, 4742–4750.
  46. Rogers, W.; Thulasi Seetha, S.; Refaee, T.A.G.; Lieverse, R.I.Y.; Granzier, R.W.Y.; Ibrahim, A.; Keek, S.A.; Sanduleanu, S.; Primakov, S.P.; Beuque, M.P.L.; et al. Radiomics: From qualitative to quantitative imaging. Br. J. Radiol. 2020, 93, 20190948.
  47. Rizzo, S.; Botta, F.; Raimondi, S.; Origgi, D.; Fanciullo, C.; Morganti, A.G.; Bellomi, M. Radiomics: The facts and the challenges of image analysis. Eur. Radiol. Exp. 2018, 2, 36.
  48. Lambin, P.; Leijenaar, R.T.H.; Deist, T.M.; Peerlings, J.; de Jong, E.E.C.; van Timmeren, J.; Sanduleanu, S.; Larue, R.; Even, A.J.G.; Jochems, A.; et al. Radiomics: The bridge between medical imaging and personalized medicine. Nat. Rev. Clin. Oncol. 2017, 14, 749–762.
  49. Zhou, M.; Scott, J.; Chaudhury, B.; Hall, L.; Goldgof, D.; Yeom, K.W.; Iv, M.; Ou, Y.; Kalpathy-Cramer, J.; Napel, S.; et al. Radiomics in Brain Tumor: Image Assessment, Quantitative Feature Descriptors, and Machine-Learning Approaches. AJNR Am. J. Neuroradiol. 2018, 39, 208–216.
  50. Chen, C.; Zheng, A.; Ou, X.; Wang, J.; Ma, X. Comparison of Radiomics-Based Machine-Learning Classifiers in Diagnosis of Glioblastoma From Primary Central Nervous System Lymphoma. Front. Oncol. 2020, 10, 1151.
  51. Cha, Y.J.; Jang, W.I.; Kim, M.S.; Yoo, H.J.; Paik, E.K.; Jeong, H.K.; Youn, S.M. Prediction of Response to Stereotactic Radiosurgery for Brain Metastases Using Convolutional Neural Networks. Anticancer Res. 2018, 38, 5437–5445.
  52. Guo, Y.; Liu, Y.; Oerlemans, A.; Lao, S.; Wu, S.; Lew, M.S. Deep learning for visual understanding: A review. Neurocomputing 2016, 187, 27–48.
  53. Ciresan, D.C.; Meier, U.; Masci, J.; Gambardella, L.M.; Schmidhuber, J. Flexible, high performance convolutional neural networks for image classification. In Proceedings of the Twenty-Second International Joint Conference on Artificial Intelligence, Barcelona, Spain, 16–22 July 2011.
  54. Papadimitroulas, P.; Brocki, L.; Christopher Chung, N.; Marchadour, W.; Vermet, F.; Gaubert, L.; Eleftheriadis, V.; Plachouris, D.; Visvikis, D.; Kagadis, G.C.; et al. Artificial intelligence: Deep learning in oncological radiomics and challenges of interpretability and data harmonization. Phys. Med. 2021, 83, 108–121.
  55. Ueda, D.; Shimazaki, A.; Miki, Y. Technical and clinical overview of deep learning in radiology. Jpn. J. Radiol. 2019, 37, 15–33.
  56. Haider, S.P.; Burtness, B.; Yarbrough, W.G.; Payabvash, S. Applications of radiomics in precision diagnosis, prognostication and treatment planning of head and neck squamous cell carcinomas. Cancers Head Neck 2020, 5, 6.
  57. Curtin, M.; Piggott, R.P.; Murphy, E.P.; Munigangaiah, S.; Baker, J.F.; McCabe, J.P.; Devitt, A. Spinal Metastatic Disease: A Review of the Role of the Multidisciplinary Team. Orthop. Surg. 2017, 9, 145–151.
  58. Tomita, K.; Kawahara, N.; Kobayashi, T.; Yoshida, A.; Murakami, H.; Akamaru, T. Surgical strategy for spinal metastases. Spine (Phila Pa 1976) 2001, 26, 298–306.
  59. Clemons, M.; Gelmon, K.A.; Pritchard, K.I.; Paterson, A.H. Bone-targeted agents and skeletal-related events in breast cancer patients with bone metastases: The state of the art. Curr. Oncol. 2012, 19, 259–268.
  60. Hamaoka, T.; Madewell, J.E.; Podoloff, D.A.; Hortobagyi, G.N.; Ueno, N.T. Bone imaging in metastatic breast cancer. J. Clin. Oncol. 2004, 22, 2942–2953.
  61. Bilsky, M.H.; Lis, E.; Raizer, J.; Lee, H.; Boland, P. The diagnosis and treatment of metastatic spinal tumor. Oncologist 1999, 4, 459–469.
  62. Burns, J.E.; Yao, J.; Wiese, T.S.; Munoz, H.E.; Jones, E.C.; Summers, R.M. Automated detection of sclerotic metastases in the thoracolumbar spine at CT. Radiology 2013, 268, 69–78.
  63. Hammon, M.; Dankerl, P.; Tsymbal, A.; Wels, M.; Kelm, M.; May, M.; Suehling, M.; Uder, M.; Cavallaro, A. Automatic detection of lytic and blastic thoracolumbar spine metastases on computed tomography. Eur. Radiol. 2013, 23, 1862–1870.
  64. O’Connor, S.D.; Yao, J.; Summers, R.M. Lytic metastases in thoracolumbar spine: Computer-aided detection at CT–preliminary study. Radiology 2007, 242, 811–816.
  65. Wiese, T.; Yao, J.; Burns, J.E.; Summers, R.M. Detection of sclerotic bone metastases in the spine using watershed algorithm and graph cut. In Proceedings of the Medical Imaging 2012: Computer-Aided Diagnosis, San Diego, CA, USA, 4–9 February 2012.
  66. Aneja, S.; Chang, E.; Omuro, A. Applications of artificial intelligence in neuro-oncology. Curr. Opin. Neurol. 2019, 32, 850–856.
  67. Duong, M.T.; Rauschecker, A.M.; Mohan, S. Diverse Applications of Artificial Intelligence in Neuroradiology. Neuroimaging Clin. N. Am. 2020, 30, 505–516.
  68. Muthukrishnan, N.; Maleki, F.; Ovens, K.; Reinhold, C.; Forghani, B.; Forghani, R. Brief History of Artificial Intelligence. Neuroimaging Clin. N. Am. 2020, 30, 393–399.
  69. Wang, J.; Fang, Z.; Lang, N.; Yuan, H.; Su, M.Y.; Baldi, P. A multi-resolution approach for spinal metastasis detection using deep Siamese neural networks. Comput. Biol. Med. 2017, 84, 137–146.
  70. Fan, X.; Zhang, X.; Zhang, Z.; Jiang, Y. Deep Learning-Based Identification of Spinal Metastasis in Lung Cancer Using Spectral CT Images. Sci. Program. 2021, 2021, 2779390.
  71. Chang, C.Y.; Buckless, C.; Yeh, K.J.; Torriani, M. Automated detection and segmentation of sclerotic spinal lesions on body CTs using a deep convolutional neural network. Skelet. Radiol. 2022, 51, 391–399.
  72. Roth, H.R.; Yao, J.; Lu, L.; Stieger, J.; Burns, J.E.; Summers, R.M. Detection of Sclerotic Spine Metastases via Random Aggregation of Deep Convolutional Neural Network Classifications. Lect. Notes Comput. Vis. Biomech. 2014, 20, 3–12.
  73. Filograna, L.; Lenkowicz, J.; Cellini, F.; Dinapoli, N.; Manfrida, S.; Magarelli, N.; Leone, A.; Colosimo, C.; Valentini, V. Identification of the most significant magnetic resonance imaging (MRI) radiomic features in oncological patients with vertebral bone marrow metastatic disease: A feasibility study. Radiol. Med. 2019, 124, 50–57.
  74. Liu, J.; Guo, W.; Zeng, P.; Geng, Y.; Liu, Y.; Ouyang, H.; Lang, N.; Yuan, H. Vertebral MRI-based radiomics model to differentiate multiple myeloma from metastases: Influence of features number on logistic regression model performance. Eur. Radiol. 2022, 32, 572–581.
  75. Xiong, X.; Wang, J.; Hu, S.; Dai, Y.; Zhang, Y.; Hu, C. Differentiating Between Multiple Myeloma and Metastasis Subtypes of Lumbar Vertebra Lesions Using Machine Learning-Based Radiomics. Front. Oncol. 2021, 11, 601699.
  76. Shi, Y.J.; Zhu, H.T.; Li, X.T.; Zhang, X.Y.; Wei, Y.Y.; Yan, S.; Sun, Y.S. Radiomics analysis based on multiple parameters MR imaging in the spine: Predicting treatment response of osteolytic bone metastases to chemotherapy in breast cancer patients. Magn. Reson. Imaging 2022, 92, 10–18.
  77. Hallinan, J.T.P.D.; Zhu, L.; Zhang, W.; Lim, D.S.W.; Baskar, S.; Low, X.Z.; Yeong, K.Y.; Teo, E.C.; Kumarakulasinghe, N.B.; Yap, Q.V.; et al. Deep Learning Model for Classifying Metastatic Epidural Spinal Cord Compression on MRI. Front. Oncol. 2022, 12.
  78. Perry, J.; Chambers, A.; Laperriere, N. Systematic Review of the Diagnosis and Management of Malignant Extradural Spinal Cord Compression: The Cancer Care Ontario Practice Guidelines Initiative's Neuro-Oncology Disease Site Group. J. Clin. Oncol. 2005, 23, 2028–2037.
  79. Hille, G.; Steffen, J.; Dünnwald, M.; Becker, M.; Saalfeld, S.; Tönnies, K. Spinal Metastases Segmentation in MR Imaging using Deep Convolutional Neural Networks. arXiv 2020, arXiv:2001.05834.
  80. Boon, I.S.; Au Yong, T.P.T.; Boon, C.S. Assessing the Role of Artificial Intelligence (AI) in Clinical Oncology: Utility of Machine Learning in Radiotherapy Target Volume Delineation. Medicines 2018, 5, 131.
  81. Li, Q.; Xu, Y.; Chen, Z.; Liu, D.; Feng, S.T.; Law, M.; Ye, Y.; Huang, B. Tumor Segmentation in Contrast-Enhanced Magnetic Resonance Imaging for Nasopharyngeal Carcinoma: Deep Learning with Convolutional Neural Network. BioMed Res. Int. 2018, 2018, 9128527.
  82. Arends, S.R.S.; Savenije, M.H.F.; Eppinga, W.S.C.; van der Velden, J.M.; van den Berg, C.A.T.; Verhoeff, J.J.C. Clinical utility of convolutional neural networks for treatment planning in radiotherapy for spinal metastases. Phys. Imaging Radiat. Oncol. 2022, 21, 42–47.
  83. Wong, J.; Fong, A.; McVicar, N.; Smith, S.; Giambattista, J.; Wells, D.; Kolbeck, C.; Giambattista, J.; Gondara, L.; Alexander, A. Comparing deep learning-based auto-segmentation of organs at risk and clinical target volumes to expert inter-observer variability in radiotherapy planning. Radiother. Oncol. 2020, 144, 152–158.
  84. Wang, Z.; Chang, Y.; Peng, Z.; Lv, Y.; Shi, W.; Wang, F.; Pei, X.; Xu, X.G. Evaluation of deep learning-based auto-segmentation algorithms for delineating clinical target volume and organs at risk involving data for 125 cervical cancer patients. J. Appl. Clin. Med. Phys. 2020, 21, 272–279.
  85. Men, K.; Dai, J.; Li, Y. Automatic segmentation of the clinical target volume and organs at risk in the planning CT for rectal cancer using deep dilated convolutional neural networks. Med. Phys. 2017, 44, 6377–6389.
  86. Vrtovec, T.; Močnik, D.; Strojan, P.; Pernuš, F.; Ibragimov, B. Auto-segmentation of organs at risk for head and neck radiotherapy planning: From atlas-based to deep learning methods. Med. Phys. 2020, 47, e929–e950.
  87. Saxena, S.; Jena, B.; Gupta, N.; Das, S.; Sarmah, D.; Bhattacharya, P.; Nath, T.; Paul, S.; Fouda, M.M.; Kalra, M.; et al. Role of Artificial Intelligence in Radiogenomics for Cancers in the Era of Precision Medicine. Cancers 2022, 14, 2860.
  88. Fathi Kazerooni, A.; Bagley, S.J.; Akbari, H.; Saxena, S.; Bagheri, S.; Guo, J.; Chawla, S.; Nabavizadeh, A.; Mohan, S.; Bakas, S.; et al. Applications of Radiomics and Radiogenomics in High-Grade Gliomas in the Era of Precision Medicine. Cancers 2021, 13, 5921.
  89. Zhong, X.; Li, L.; Jiang, H.; Yin, J.; Lu, B.; Han, W.; Li, J.; Zhang, J. Cervical spine osteoradionecrosis or bone metastasis after radiotherapy for nasopharyngeal carcinoma? The MRI-based radiomics for characterization. BMC Med. Imaging 2020, 20, 104.
  90. Amini, B.; Beaman, C.B.; Madewell, J.E.; Allen, P.K.; Rhines, L.D.; Tatsui, C.E.; Tannir, N.M.; Li, J.; Brown, P.D.; Ghia, A.J. Osseous Pseudoprogression in Vertebral Bodies Treated with Stereotactic Radiosurgery: A Secondary Analysis of Prospective Phase I/II Clinical Trials. AJNR Am. J. Neuroradiol. 2016, 37, 387–392.
  91. Bahig, H.; Simard, D.; Letourneau, L.; Wong, P.; Roberge, D.; Filion, E.; Donath, D.; Sahgal, A.; Masucci, L. A Study of Pseudoprogression After Spine Stereotactic Body Radiation Therapy. Int. J. Radiat. Oncol. Biol. Phys. 2016, 96, 848–856.
  92. Taylor, D.R.; Weaver, J.A. Tumor pseudoprogression of spinal metastasis after radiosurgery: A novel concept and case reports. J. Neurosurg. Spine 2015, 22, 534–539.