The Significance of Biomarkers in Neurodegenerative Disease Pathology

With the inexorable aging of the global populace, neurodegenerative diseases (NDs) like Alzheimer’s disease (AD), Parkinson’s disease (PD), and amyotrophic lateral sclerosis (ALS) pose escalating challenges, which are underscored by their socioeconomic repercussions. A pivotal aspect in addressing these challenges lies in the elucidation and application of biomarkers for timely diagnosis, vigilant monitoring, and effective treatment modalities.

  • neurodegenerative diseases
  • biomarkers
  • aging

1. Introduction

Addressing the economic repercussions of the upsurge in Alzheimer’s disease is paramount. A proactive approach involving early detection and intervention is vital not only to mitigate the prevalence of Alzheimer’s disease (AD) but also to elevate the quality of life of both the affected individuals and their caregivers. The institution of robust social support mechanisms is integral to this strategy. Non-pharmacological measures emerge as the preferred modalities in both the prevention and management of AD [1]. There is a prevailing hypothesis linking socioeconomic standing to AD, although the underlying cause of this association has not been unequivocally elucidated by prior research. A study by Wang et al. employed Mendelian randomization to delve into the potential influence of socioeconomic strata on AD and probed whether elevated income exerted a protective effect against the disease’s onset [2]. From a health economics perspective, evaluations bifurcate into comparative analyses, assessing the cost–benefit ratio of varied therapeutic avenues, and cost-of-illness (COI) evaluations, which ascertain the economic strain of an ailment from a defined standpoint. Parkinson’s disease has been the subject of numerous COI studies across diverse global regions [3]. The protracted nature of PD, characterized by escalating disability and increasing dependence in activities of daily living (ADLs), imposes a substantial socioeconomic load. Advanced stages necessitate specialized institutional care, entailing significant resources and expenditures. Moreover, the familial impact of PD is profound, with most ADL-dependent patients relying on home-based care provided by family members [4]. A subsequent exploration hypothesizes a potential correlation between ALS risk and dietary habits, specifically the frequent intake of expensive, high-trophic-level fish species known for elevated mercury content. This led to a detailed examination of the interplay between ALS, socioeconomic status, and mercury exposure via fish consumption [5].

The discernment of consistent biomarkers holds promise in advancing the early detection of neurodegenerative diseases, paving the way for the initiation of tailored therapeutic regimens. At present, the realm of epigenetics lacks robust and dependable biomarkers conducive to the diagnosis, categorization, or tracking of neurodegenerative disease progression [6]. In the context of available diagnostic modalities for neurodegenerative ailments, while pathological evaluations are held in high esteem across diverse afflictions, their applicability is limited in discerning neurodegenerative diseases during a patient’s lifetime. Alternatives like positron emission tomography (PET) scans or emergent biomarkers (encompassing genomics and proteomics) present potential breakthroughs and are being integrated into refined diagnostic parameters [7]. However, it is noteworthy that parameters such as DNA methylation levels, SIRT activity, and BDNF expression witness a marked decline in individuals diagnosed with dementia or Parkinson’s disease. Hence, the concurrent assessment of these epibiomarkers might enhance the diagnostic accuracy for neurodegenerative diseases. Given the reversibility of epigenetic alterations, gauging parameters like DNA methylation levels, SIRT activity, and BDNF expression could equip medical practitioners with insights to evaluate the efficacy of therapeutic interventions [8].

2. The Significance of Biomarkers in Neurodegenerative Diseases

2.1. What Are Biomarkers and Why Are They Important?

The concept of a “biomarker”, derived from the amalgamation of “biological” and “marker”, encompasses a broad range of medical signs. These signs provide objective evidence of a patient’s health condition and can be consistently and accurately quantified. This is distinct from medical symptoms, which are subjective sensations or complaints reported by the patient [9]. Biomarkers serve as pivotal tools in the methodical evolution of pharmaceuticals and medical apparatuses [10]. Yet, despite their immense significance, there exists a pronounced ambiguity surrounding their foundational definitions and the intricacies of their application in both research and clinical settings [11]. The spectrum of biomarkers ranges from elementary metrics like pulse and blood pressure to intricate laboratory assessments of blood and other biological specimens. Historically, medical signs have always been integral to clinical practice, with biomarkers representing the pinnacle of objective and quantifiable indicators that contemporary laboratory sciences can consistently measure. In the realm of drug innovation and broader biomedical investigations, biomarkers hold a transformative role. Deciphering the interplay between quantifiable biological mechanisms and clinical results is paramount for bolstering our repertoire of disease interventions and for a profound comprehension of standard physiological processes [12]. For biomarkers to be genuinely efficacious as replacements for clinically relevant endpoints, there is a prerequisite to thoroughly grasp the standard biological mechanisms, the alterations in disease conditions, and the impacts of varied interventions, be they drug-induced, device-based, or other [9]. The imperative for the prompt and precise identification of neurodegenerative conditions in clinical environments cannot be overstated. Beyond furnishing diagnostic and prognostic insights, this need also encompasses the fine-tuning of therapeutic approaches, ensuring apt care and support, and offering patients avenues to participate in clinical therapeutic studies [13].

2.2. Differentiating between Risk, Prodromal, Clinical, Wet, Dry Markers and Surrogate Endpoints

The methodology of risk assessment finds its application across diverse clinical spheres and for a variety of clinical outcomes. Regardless of the specific clinical domain or outcome in question, the foundational principles and techniques for evaluating risk markers and risk assessment remain consistent. Risk is typically gauged by counting the number of outcome incidents over a specified time span. This is traditionally encapsulated either via a survival curve or by denoting the fraction of incidents within a designated time frame, such as 30 days or a year [14]. In practice, there is often a strong interrelation among multiple biomarkers, complicating the process of pinpointing a singular prominent marker. Within the field of periodontology, the quest for risk biomarkers that can predict potential disease onset in individuals devoid of clinical symptoms is ongoing [15]. For Parkinson’s disease in its prodromal phase, while markers can facilitate diagnosis, it is imperative to understand four central characteristics of these markers, especially if they are to guide the selection of neuroprotective treatments. Among these, understanding the specificity or predictive accuracy of the marker is crucial, given the notable variances in specificity and positive predictive value (PPV) among different prodromal markers [16]. In this context, a “wet biomarker” is delineated as a prospective biomarker that can be objectively ascertained within a body fluid [17]. Biomarkers have been categorized into two main types: “dry” markers, which encompass imaging parameters, and “wet” markers, which refer to genetic and biochemical elements detectable in fluids such as blood, serum, urine, and tissue samples [18]. There are also surrogate markers (or surrogate endpoints), which stand in for a clinical endpoint that lies further downstream of an exposure or intervention. A familiar illustration is the relationship between smoking and lung cancer [19]: death from lung cancer is the clinical endpoint, and smoking serves as a distant surrogate marker of that outcome. Well-validated surrogate endpoints would be of great value because they could more readily delineate the boundary between the general population and disease. Selecting surrogate endpoints and demonstrating their validity, however, is a challenging task, because it requires an extraordinarily thorough understanding of the disease’s pathophysiology. Several studies in the current literature have underscored this important yet difficult task and have documented the failure of candidate surrogate endpoints in numerous settings, including neurodegenerative disease [20][21][22][23][24].
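Because the positive predictive value of a prodromal marker depends not only on its sensitivity and specificity but also on how common the target condition is in the screened population, a brief worked example may be helpful. The Python sketch below is a minimal illustration using entirely hypothetical sensitivity, specificity, and prevalence values (not drawn from any of the cited studies); it simply applies Bayes’ theorem to show how PPV shifts with prevalence.

    def positive_predictive_value(sensitivity: float, specificity: float, prevalence: float) -> float:
        """PPV = P(condition present | marker positive), via Bayes' theorem."""
        true_positive_mass = sensitivity * prevalence
        false_positive_mass = (1.0 - specificity) * (1.0 - prevalence)
        return true_positive_mass / (true_positive_mass + false_positive_mass)

    # Hypothetical prodromal markers with identical accuracy, applied in
    # populations where the target condition has different prevalences.
    for prevalence in (0.01, 0.05, 0.20):
        ppv = positive_predictive_value(sensitivity=0.90, specificity=0.90, prevalence=prevalence)
        print(f"prevalence {prevalence:.0%}: PPV = {ppv:.2f}")

With 90% sensitivity and specificity in this toy example, the PPV climbs from roughly 0.08 at 1% prevalence to about 0.32 at 5% and 0.69 at 20%, which is one reason prodromal markers of comparable accuracy can differ so markedly in positive predictive value.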

2.3. Overview of Their Roles in Early Diagnosis, Monitoring Disease Progression, and Evaluating Therapeutic Efficacy

At present, the categorization of most biomarkers hinges on the pathogenic processes they signify. For conditions like Alzheimer’s disease and the frontotemporal lobar degeneration (FTLD) spectrum, the primary focus is on biomarkers indicative of pathology, such as those for amyloid-β (Aβ) and tau pathologies. These biomarkers are predominantly evaluated through CSF examinations, blood tests, and positron emission tomography scans [25]. In the preclinical stages of AD, while there are detectable biomarkers signaling brain alterations, clinical manifestations remain absent [26]. Conversely, in Parkinson’s disease (PD), the onset of classic motor symptoms is observed only after a significant proportion, over half, of neurons in the substantia nigra (SN) have already degenerated [27]. Consequently, pinpointing these conditions early is imperative for implementing strategies geared toward preventing neuronal loss. Over recent years, there has been a concerted effort by researchers to bolster the advancement of reliable biomarkers for neurodegenerative ailments. Despite these endeavors, results have often been inconsistent and have not always met optimal standards. The trajectory of medical practice is increasingly leaning toward precision medicine, underscoring the pressing need to seamlessly incorporate disease-specific biomarkers into clinical routines and to engineer potent disease-altering treatments [25]. A dual approach to neurodegenerative disease could consider, first, neuroinflammation, a key factor that is both a consequence and a cause of neurodegeneration [28]. Second, over the last decade, research has pinpointed another key factor in neurodegeneration: carotid intima-media thickness (cIMT). cIMT has long been debated as a surrogate endpoint for neurodegenerative disease; however, it is now regarded as a relevant influence in neurodegenerative disease [29][30][31].

2.4. Overview of Biomarkers in Huntington’s Disease, Multiple Sclerosis, Frontotemporal Dementia and Essential Tremor

2.4.1. Huntington’s Disease

Increasing emphasis has been placed on the significance of white matter in the degenerative process [32], as widespread alterations can be detected over a decade prior to anticipated disease onset [33]. A comprehensive study amalgamated clinical and morphometric imaging data from 1082 participants, sourced from the IMAGE-HD, TRACK-HD, and PREDICT-HD studies, with longitudinal observations spanning 1–10 years. The findings from this research indicate that imaging might be a viable endpoint in clinical trials due to its potentially heightened sensitivity [34].
Regarding the wet biomarkers, a study indicates that mutant huntingtin (mHTT) levels exhibit correlations with clinical scores cross-sectionally, as well as with CSF tau and neurofilament light chain (NfL) [35], both of which are indicators of neuronal damage [36]. This suggests that mHTT is likely released from compromised or deteriorating neurons. Given the pivotal role of mHTT in HD pathogenesis, it emerges as a salient potential biomarker. Not only is it the pathogenic agent in itself, but in the context of huntingtin-lowering therapies, it stands as a crucial gauge of pharmacodynamics, signifying whether the therapeutic agent has effectively engaged its target and manifested the anticipated immediate biological effect [37].
One study indicated that the accumulated data suggest a discernible fraction of mHTT in the CSF is derived from striatal cells. These results advocate for the application of CSF mHTT as an HD biomarker in evaluating the target engagement of therapeutic interventions tailored to decrease mHTT levels in the striatum [38]. A subsequent study explored the feasibility of utilizing noninvasive positron emission tomography (PET) for the direct assessment of therapeutic efficacy and the monitoring of disease evolution in relation to mHTT. In this context, the novel radioligand [11C]CHDI-626 was characterized and examined longitudinally for mHTT PET imaging within the zQ175DN mouse model of HD. Notwithstanding its rapid metabolism and kinetics, the radioligand proved efficacious for mHTT PET imaging [39]. Liu et al.’s study furnishes initial evidence indicating that the early introduction of HTT-lowering treatment, prior to the manifestation of motor symptoms and striatal atrophy, can defer the onset and decelerate the progression of pathology and phenotype in a mouse model expressing full-length mHTT [40]. Concurrently, the research findings posit that the observed alteration in arteriolar cerebral blood volume (CBVa) in premanifest zQ175 mice is a downstream effect stemming from the influence of mHTT on neural activity/metabolism. Furthermore, the study suggests that a diminished rate of oxygen/nutrient delivery, attributed to a reduced cerebral blood volume and a decline in the glucose transporter GLUT1 across a jeopardized neurovascular network during the manifest stage, may eventually instigate neuronal dysfunction and degeneration [40].

2.4.2. Multiple Sclerosis

In multiple sclerosis (MS), magnetic resonance imaging (MRI) elucidates the dimensions, quantity, chronology, and evolution of lesions within the central nervous system (CNS). Consequently, MRI is integral to the diagnostic process and therapeutic surveillance [41][42][43]. A study by Huang et al. demonstrates an up-regulation of MIP-1α and CXCL10 in the cerebrospinal fluid (CSF) of patients diagnosed with multiple sclerosis. Collectively, these cytokine biomarkers serve as a significant indicator of T cell activity, offering a measure that is both independent of and complementary to the previously documented CXCL13, a chemokine targeting B lymphocytes [44]. To date, the singular CSF biomarker of clinical significance for MS is the presence of immunoglobulin G (IgG) oligoclonal bands (OCBs). These OCBs signify the intrathecal production of IgG, acting as a broader indicator of adaptive immunity activation within the CNS. It is pertinent to note that OCBs are not exclusive to MS; they have been identified in various inflammatory neurological disorders. Additionally, approximately 5% of MS cases do not exhibit CSF OCBs based on conventional assays [45][46][47][48][49]. Blood-based serum neurofilament light chain (sNfL) is a potential and easily accessible prognostic and treatment-response biomarker for patients diagnosed with multiple sclerosis. It is important to note that, without supplementary clinical context, sNfL on its own does not suffice for diagnosing multiple sclerosis or for distinguishing it from other neuroinflammatory conditions characterized by neuroaxonal damage and elevated sNfL levels, such as neuromyelitis optica spectrum disorders or myelin oligodendrocyte glycoprotein (MOG) encephalomyelitis [50][51][52][53].

2.4.3. Frontotemporal Dementia

Over the past decade, neurofilament light chain (NfL) has garnered attention as a potential biomarker for FTLD due to its sensitivity in detecting neurodegeneration. Moreover, its levels demonstrate a correlation with the pace of clinical progression, providing prognostic insights. Recent scholarly investigations underscore the utility of NfL as a discriminative biomarker between behavioral variant frontotemporal dementia (bvFTD) and primary psychiatric disorders, exhibiting areas under the curve ranging from 0.84 to 0.94 [54][55][56][57][58]. Progranulin (GRN) can be quantified in both blood and CSF, although the preponderance of research has been conducted on blood samples. Preliminary investigations reported remarkable sensitivity and specificity (both exceeding 95%) with a threshold of 61.5 ng/mL (ascertained in plasma using the Adipogen assay). However, subsequent research has proposed an elevated threshold of 71.0 ng/mL, boasting a sensitivity of 98.1% and a specificity of 98.5%. It is posited that these levels are diminished from birth, as they appear to be low even when first assessed during late adolescence. Furthermore, these levels manifest consistent stability over extended periods, remaining relatively unaltered for up to four years, as evidenced in one study [59][60][61].
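To make the reported cutoffs concrete, the short sketch below shows how a single plasma progranulin threshold (the 71.0 ng/mL value mentioned above) converts measured concentrations into a binary carrier/non-carrier call, and how sensitivity and specificity then follow from the resulting counts. The concentration values and carrier labels here are invented purely for illustration and do not come from the cited studies; only the threshold is taken from the text.

    # Hypothetical plasma progranulin concentrations (ng/mL) paired with true
    # GRN mutation-carrier status; values are illustrative, not study data.
    samples = [
        (32.5, True), (48.0, True), (66.9, True), (74.2, True),        # carriers
        (95.3, False), (110.7, False), (68.4, False), (132.1, False),  # non-carriers
    ]

    THRESHOLD_NG_PER_ML = 71.0  # proposed cutoff discussed in the text

    def predicted_carrier(concentration: float) -> bool:
        """Flag a sample as a likely GRN mutation carrier when progranulin is low."""
        return concentration < THRESHOLD_NG_PER_ML

    tp = sum(1 for conc, carrier in samples if predicted_carrier(conc) and carrier)
    fn = sum(1 for conc, carrier in samples if not predicted_carrier(conc) and carrier)
    tn = sum(1 for conc, carrier in samples if not predicted_carrier(conc) and not carrier)
    fp = sum(1 for conc, carrier in samples if predicted_carrier(conc) and not carrier)

    sensitivity = tp / (tp + fn)  # proportion of carriers correctly flagged
    specificity = tn / (tn + fp)  # proportion of non-carriers correctly cleared
    print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")

On these eight invented samples the threshold yields a sensitivity and specificity of 0.75 each; the calculation, not the numbers, is the point, and the same bookkeeping underlies the far higher values reported for the published cohorts.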

2.4.4. Essential Tremor

In a prospective study that distinguished between sporadic and hereditary ET cases, uric acid levels were compared with those of controls. The results did not indicate significant deviations, thereby not affirming a neuroprotective function of uric acid in ET. Nonetheless, it is noteworthy that a correlation emerged between reduced uric acid levels and a later age of onset in sporadic cases, suggesting its potential significance as an indicator of neurodegeneration in such patients [62]. A study by Wang et al. introduced a methodologically sound, consensus-based approach to scrutinize cerebellar involvement in ET, leveraging an augmented cohort for enhanced statistical power and taking into account the implications of MRI processing pipelines and statistical frameworks. This examination did not identify cerebellar involvement in advanced ET when synthesizing findings from three MRI biomarkers: voxel-based morphometry, cerebellar gray matter and white matter volumetry, and cerebellar lobular volumetry. The hypothesis was further assessed using ten prevalent statistical models based on biomarkers from FreeSurfer, SUIT, and MAGeT. Notably, no cerebellar ROI derived from these three pipelines exhibited a consistent significant discrepancy [63]. Another study, performed by Yu et al., revealed that erythrocytic total and aggregated α-synuclein (α-syn) concentrations were significantly elevated in PD and ET patients in comparison to healthy controls (HCs). Notably, erythrocytic total α-syn levels were observed to be markedly higher in the ET cohort than in the PD group. Additionally, the ratios of erythrocytic aggregated to total α-syn levels in the ET group were discernibly reduced relative to those in the PD and HC groups. A significant correlation was also identified between erythrocytic aggregated α-syn levels and disease duration in ET patients [64].