5. Human Leukocyte Antigen D-Related (HLA-DR) Expression on Monocytes
Major histocompatibility complex (MHC) is a set of cell surface proteins crucial for the recognition of foreign molecules by the adaptive immune system. Human leukocyte antigen D-related (HLA-DR) is an MHC class II molecule expressed on most types of immune cells, such as monocytes/macrophages, dendritic cells and B cells. HLA-DR expression correlates with immune cell activation and antigen presentation, a step that initiates the adaptive immune response. Conversely, a low level of HLA-DR expression is associated with an anti-inflammatory phenotype. In 1990, Hershman et al. first reported a decreased frequency of HLA-DR+ monocytes soon after trauma in previously healthy individuals. Numerous factors govern HLA-DR expression on immune cells: it is upregulated by pro-inflammatory cytokines such as interferon-gamma (IFNγ) and downregulated by anti-inflammatory cytokines such as IL-10. Medications such as corticosteroids and catecholamines can also reduce HLA-DR expression. Monocytic HLA-DR (mHLA-DR) expression is a pivotal link between innate and adaptive immunity; thus, the key interplay of monocytes with T cells is often referred to as the “immunological synapse”
[174]. The persistence and magnitude of mHLA-DR expression have been used as a global marker of immune function in critically ill patients since first proposed, whereby a low mHLA-DR serves as an indicator of monocyte anergy and is associated with lower tumor necrosis factor (TNF)-alpha and IL-1 production in response to bacterial insult [175].
Monneret et al. conducted one of the landmark studies which attempted to describe mHLA-DR expression as a predictor of mortality in septic shock patients
[176]. The group explored whether low mHLA-DR expression, as a biomarker of immunosuppression, was an independent predictor of mortality in 93 septic shock patients who survived the initial 48 h of septic shock. While mHLA-DR expression levels were not significantly different between survivors and nonsurvivors within the first 1–2 days, significant differences were observed at days 3–4, with an increased percentage of HLA-DR-positive monocytes in survivors (43%) compared with nonsurvivors (18%). Multivariate logistic regression analysis showed that low mHLA-DR (<30%) at days 3–4 was an independent predictor of mortality in septic shock patients. ROC curve analysis demonstrated that 30% HLA-DR-positive monocytes at days 3–4 was the best cut-off value for mortality prediction, with an AUC of 0.76. Therefore, dynamic changes in mHLA-DR expression over time in the setting of sepsis are important in view of potential inter-individual variations.
Following that, the same group aimed to address whether low mHLA-DR expression was associated with an increased number of nosocomial infections (NI) after septic shock in 209 septic shock patients. mHLA-DR was measured at days 3–4 and 6–9 after the onset of shock, and patients were screened daily for the development of NI
[177]. mHLA-DR at days 3–4 was found to be diminished in nonsurvivors (20%) versus survivors (43%), similar to previous studies. In line with these findings, the mHLA-DR value expressed as mean fluorescence intensity (MFI) was 33 in nonsurvivors versus 67 in survivors. At days 3–4, patients who went on to develop NI had lower MFI values (39 versus 65 in those without NI). ROC curve analysis revealed that an MFI value of 54 was the best cut-off to predict NI development, with a sensitivity of 68% and specificity of 62%. At days 6–9, the best cut-off MFI value was 57, with an AUC of 0.64 (sensitivity 66%, specificity 60%). mHLA-DR ≤ 54 at days 3–4 and ≤ 57 at days 6–9 remained independently associated with NI occurrence after adjustment for clinical confounders. The study concluded that persistently low mHLA-DR expression was an independent predictor of secondary NI development in septic shock patients.
There is an emerging body of evidence that immune biomarkers are essential to guiding immunotherapy and risk stratification on an individual basis. Functional assessment of the immune system using mHLA-DR expression may reflect the net sum of pro- and anti-inflammatory factors and, therefore, the actual inflammatory phenotype and phase of sepsis; as such, it may be a better choice than single pleiotropic and redundant inflammatory mediators
[178].
It has been suggested that a combination of several immune cell function markers provides benefit over interpretation of individual biomarkers alone in predicting the risk of NI and outcome in critically ill patients. Conway Morris et al. demonstrated that a combination of three measures of immune cell function, namely neutrophil CD88 expression, mHLA-DR expression and the percentage of regulatory T cells, was significantly predictive of susceptibility to developing NI
[179]. In an earlier study, they showed that critically ill patients have significant dysfunction of peripheral blood neutrophils, mediated predominantly by activated complement (C5a)
[180]. A recent follow-up study (the INFECT study) was completed by the same group, aimed at validating these results in a cohort of critically ill patients in the settings of trauma, sepsis and post-surgical complications, which all bear similarities in the innate and adaptive immune responses
[181]. The cohort included 138 patients. Reduced neutrophil CD88, reduced monocyte HLA-DR and elevated proportions of Tregs were all associated with subsequent infection. The presence of immune dysfunction was linked to a commensurate increase in the risk of infection, from 14% in patients with no dysfunction to 59% in patients with dysfunction of all three markers
[182]. The study also demonstrated the feasibility of standardized flow cytometry across multiple sites
[183].
Sepsis-induced immunosuppression is a global process that can be seen both in the systemic circulation and in specific organs such as the spleen and lung. In a study investigating immune status at the time of death, rapid post-mortem spleen and lung tissue harvest was performed at the bedsides of 40 patients who died of severe sepsis, and the tissue was compared with control spleen and lung tissue. To identify potential mechanisms of immune dysfunction, cytokine secretion assays and immunophenotyping of cell surface receptor-ligand expression profiles were performed. Cytokine secretion in sepsis patients was found to be less than 10% of that in controls, independent of age, duration of sepsis, corticosteroid use and nutritional status. Immunohistological staining revealed extensive depletion of splenic CD4, CD8 and HLA-DR cells in sepsis patients compared with controls. The study concluded that patients who die in the ICU following sepsis have biochemical, flow cytometric and immunohistochemical findings consistent with immunosuppression compared with patients who die of non-septic causes
[184].
In critically ill patients, it has been suggested that IAI is best assessed with multiple measurements of mHLA-DR expression over time rather than at a single time point. It has previously been shown that a persistent value of <8000 mHLA-DR molecules/cell for over two days is associated with an increased risk of NI and mortality. Determining appropriate threshold levels of mHLA-DR is challenging given that there are several methods for measuring its expression. The percentage of HLA-DR-positive monocytes, with a cut-off at 30% for detection of IAI, is a non-standardized method. In a recent comparison of the conventional method with a standardized quantitative assay using measurement of bound HLA-DR antibodies per cell (mAb/cell), it was determined that the previously established cut-off value of 30% mHLA-DR corresponds to approximately 5000 mAb/cell, and 45% mHLA-DR to approximately 8000 mAb/cell
[174,185], with the range between 30% and 45% mHLA-DR termed “borderline immunosuppression”. A cut-off value of 8000 mAb/cell has been used by authors in interventional clinical trials
[186].
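As an illustration only, the standardized thresholds described above can be expressed as a simple classification rule. The following Python sketch is hypothetical (the function name and labels are ours) and assumes the cut-offs of approximately 5000 and 8000 mAb/cell reported in the studies cited above:

```python
def classify_mhla_dr(mab_per_cell: float) -> str:
    """Classify immune status from standardized mHLA-DR expression,
    measured as bound HLA-DR antibodies per cell (mAb/cell).

    Cut-offs follow the correspondence described above:
    30% HLA-DR-positive monocytes ~ 5000 mAb/cell,
    45% HLA-DR-positive monocytes ~ 8000 mAb/cell.
    """
    if mab_per_cell < 5000:   # below the 30% equivalent
        return "immunosuppression"
    if mab_per_cell < 8000:   # the 30-45% equivalent range
        return "borderline immunosuppression"
    return "no immunosuppression"

print(classify_mhla_dr(4200))  # immunosuppression
print(classify_mhla_dr(6500))  # borderline immunosuppression
print(classify_mhla_dr(9100))  # no immunosuppression
```

Such a rule is only as meaningful as the assay standardization behind it; the point of the studies above is that mAb/cell quantification, rather than raw percentages, makes cut-offs comparable between laboratories.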
In terms of outcome prediction, the prognostic value of utilizing mHLA-DR to predict mortality in 79 adult patients with severe sepsis has been investigated in a prospective observational study
[187]. mHLA-DR levels were measured on days 0, 3 and 7 following admission to the ICU. ΔmHLA-DR3 and ΔmHLA-DR7 (defined as the changes in mHLA-DR on day 3 and day 7, respectively, relative to the value obtained on day 0 of admission) were calculated. The data for 28-day survivors and nonsurvivors were compared. The 28-day mortality in patients grouped by mHLA-DR expression, with 30% as the cut-off value on days 0, 3 and 7, showed no significant difference between the groups, suggesting that single measurements at these time points had little predictive value unless interpreted as part of a temporal trend. Additionally, it was shown that mHLA-DR levels return to normal in less than 7 days in injured patients who have an uneventful recovery, whereas they remain persistently decreased in patients who die or develop secondary infections. A dynamic view of mHLA-DR expression in critically ill septic patients shows that survivors tend to progressively normalize their mHLA-DR levels
[188].
One study aimed to assess the persistence of sepsis-induced immunosuppression by measuring several markers, among them mHLA-DR, at ICU discharge and 6 months after ICU discharge in patients admitted to the ICU for septic shock
[189]. The authors concluded that while immune alterations persist at the time of ICU discharge, there are no ongoing immune alterations in septic shock survivors 6 months later.
The value of temporal changes in mHLA-DR levels in the prediction of mortality has been further demonstrated in patients with severe acute pancreatitis (SAP). One group assessed the effect of changes in mHLA-DR on survival in SAP patients
[190]. Survivors were found to have upregulated mHLA-DR expression, whereas in the late mortality group it was persistently downregulated. mHLA-DR expression on day 10 (HLA-DR10) gave the only statistically significant correlation with late mortality. ROC curve analysis confirmed that HLA-DR10 was a reliable predictor of late mortality, with an AUC of 0.944; the optimal cut-off value was 52.3%, with a sensitivity of 94.4% and specificity of 85.7%. In another study of 64 patients with SAP, mHLA-DR expression was measured at admission and at 7 and 14 days following the onset of SAP
[191]. The study demonstrated that patients with persistently low percentages of mHLA-DR throughout the observation period were more likely to develop sepsis subsequently in their clinical course. It was concluded that this was a reliable predictor of the development of sepsis in SAP patients.
Therefore, the introduction of mHLA-DR measurement as a point-of-care test at the bedside in the ICU may be beneficial for critically ill patients. An automated tabletop cytometer may be a suitable tool for ICU patients as well as for clinical trials, as it requires neither sample preparation nor specific skills in flow cytometry, and results are obtained in less than 30 min
[192].
In addition to mHLA-DR expression, an alternative method of assessing immune status that has been extensively investigated involves detection of ex vivo lipopolysaccharide (LPS)-induced TNF-alpha production, a functional test of monocytic immune capacity. Recently, mHLA-DR expression and ex vivo LPS-induced TNF-alpha production were compared as predictors of 28-day outcome and the development of secondary infections in a prospective observational study of 83 adult patients with severe sepsis
[193]. Blood samples were collected at three time points: days 1–2, 3–4 and 6–8 after the diagnosis of sepsis. The study showed that mHLA-DR expression was significantly reduced in nonsurvivors on days 3–4 and 6–8. Furthermore, median mHLA-DR expression decreased from days 1–2 to days 3–4 in patients who developed secondary infections, while it increased in those who did not. This again suggested that changes in mHLA-DR expression over time, rather than values at individual time points, are more useful for outcome prediction. The study postulated that mHLA-DR expression may not be predictive in the early phase of sepsis because circulating monocytes are likely recruited out of the bloodstream to sites of active infection, resulting in an underestimation of the magnitude of suppression. Ex vivo LPS-induced TNF-alpha production did not differ between survivors and nonsurvivors, nor between patients who developed secondary infection and those who did not, although there was a statistically significant correlation between LPS-induced TNF-alpha production and mHLA-DR expression. The group also noted that studies of LPS-induced TNF-alpha production to date have primarily utilized pediatric populations; in light of the increasing recognition of the role of immunosenescence in blunting the host response to infection, it was suggested that increased age and a high incidence of co-morbidities may contribute to an attenuated TNF-alpha response. The study found mHLA-DR to be the more accurate predictor of mortality and secondary infections. In this particular study, the effect of diabetes mellitus, as a co-morbidity, on the immune response in sepsis was not taken into account; it would be interesting to address this in future studies
[194].
There may be a link between immunosenescence and a consequent immune state that increases the risk of a dysregulated inflammatory response. Elderly patients are known to display enhanced apoptotic pathways that may contribute to mortality due to sepsis
[195]. Evidence supporting this can be seen in a study of 73 critically ill patients in whom ex vivo LPS-induced TNF-alpha production was measured and found to be similar in patients who did and those who did not develop an ICU-acquired infection
[196]. A study carried out a decade earlier, which recruited 19 septic trauma patients, found differing results
[197]. On the day after the clinical diagnosis of sepsis, ex vivo LPS-induced TNF-alpha secretion was found to be significantly lower in nonsurvivors than in survivors of sepsis. The study concluded that ex vivo LPS-induced TNF-alpha production may be superior to mHLA-DR expression as an early predictor of clinical outcome in multiple trauma patients with sepsis.
Another consideration in employing mHLA-DR measurements in an intensive care setting is the relative ease of running such a test
[198,199]. Future interventional studies targeting the immune response during sepsis might combine a functional test with a phenotypic immunological biomarker for the purpose of target group selection based on biological plausibility and potential intervention effectiveness.
The validity of monocyte HLA-DR expression as a predictor of early mortality was explored in a recent study of 52 septic patients. Monocyte HLA-DR expression was found to be significantly lower in nonsurvivors at the time of diagnosis compared with survivors and served as an independent predictor of 28-day mortality following sepsis
[200].
Another recent study, performed by Duggal et al., showed that CD14+ve HLA-DRdim/low monocytes were diminished in patients with poorer outcomes in the ICU [201].
In bacterial sepsis, there is evidence to suggest different mechanisms underlying the clinical manifestations of Gram-positive and Gram-negative sepsis. Some microbial challenges may determine the levels of mediators that damage both the infecting microorganism and the host. For example, lipoteichoic acid (LTA) of Gram-positive bacteria and lipopolysaccharide (LPS) of Gram-negative bacteria have been shown to elicit different responses from the host [202,203,204,205,206].
In the setting of trauma, the predictive potential of mHLA-DR in 80 trauma patients was explored in one prospective study
[207]. Daily measurements of mHLA-DR were performed during the first 4 days following trauma. The lowest expression of mHLA-DR was found on day 2. Patients who restored mHLA-DR expression by day 3 appeared to be protected from infections, while those with persistently reduced mHLA-DR expression appeared to be at greater risk of infection. A ratio of mHLA-DR expression between day 3 and day 2 below 1.2 was found to be independently associated with the development of sepsis. Early mHLA-DR monitoring may therefore provide information preceding infection, thus allowing targeted antibiotic prophylaxis. Another interesting study of trauma patients aimed to investigate the release of DAMPs in the early, prehospital phase and its relationship with immunosuppression and NI
[89]. Blood was obtained from 166 adult trauma patients at the trauma scene, in the emergency room (ER) and serially afterward. Circulating levels of nuclear and mitochondrial DNA and HSP70 were determined. Immunosuppression was assessed by qPCR analysis of HLA-DRA gene expression and by ex vivo LPS-induced cytokine production. The study found that HLA-DRA expression was attenuated directly after trauma and did not recover during the follow-up period, whereas ex vivo cytokine production revealed an anti-inflammatory phenotype as early as at the trauma scene, which persisted over the following days. By the time of arrival in the ER there was significantly reduced HLA-DR mRNA associated with increased levels of anti-inflammatory IL-10. This contrasts with the prevailing theory that immune dysfunction develops in the days following trauma. The importance of immunosuppression after trauma was underscored by the observation that an HLA-DR mRNA ratio of <1 between day 3 samples and samples obtained in the ER was associated with an increased rate of NI. Higher concentrations of nuclear DNA were also associated with infections. The study concluded that plasma levels of DAMPs are associated with immunosuppression that is apparent within minutes to hours of trauma, and that this profound immunosuppression is associated with increased susceptibility to NI following trauma.
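The day-3/day-2 recovery ratio from the trauma study above lends itself to a short worked example. This Python fragment is a hypothetical illustration of that single published threshold (1.2), with made-up expression values; it is not a validated clinical tool:

```python
def mhla_dr_recovery_ratio(day2_expression: float, day3_expression: float) -> float:
    """Ratio of mHLA-DR expression on day 3 to day 2 after trauma."""
    return day3_expression / day2_expression

def flags_sepsis_risk(day2_expression: float, day3_expression: float) -> bool:
    """In the study described above, a day-3/day-2 ratio below 1.2
    was independently associated with subsequent sepsis."""
    return mhla_dr_recovery_ratio(day2_expression, day3_expression) < 1.2

print(flags_sepsis_risk(40.0, 44.0))  # True: ratio 1.1, failed recovery
print(flags_sepsis_risk(40.0, 60.0))  # False: ratio 1.5, recovering
```

The design point the study makes is that the trend (recovery versus stagnation) carries the signal, not the absolute mHLA-DR value on any single day.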
Another study sought to clarify the complex interplay of the immune response to severe trauma. Ten trauma patients with injury severity scores greater than 20 were evaluated at days 1, 3 and 5 after injury
[208]. The study found that the percentage of circulating monocytes significantly increased after injury, possibly due to enhanced cell proliferation. Ex vivo stimulated TNF-alpha production and the percentage of circulating HLA-DR-positive monocytes were significantly decreased in trauma patients compared with age- and gender-matched controls at all time points. These findings suggested that monocyte behavior was significantly influenced by trauma and may reflect suppressed antimicrobial function. Surprisingly, monocyte phagocytosis was found to be at baseline function and the oxidative burst was augmented, suggesting preservation of innate antimicrobial functions. The study used single-cell mass cytometry to characterize the phenotype and function of major innate and adaptive immune responses in trauma patients. This was another significant study that can potentially pave the way to individualized risk stratification based on deep immune profiling of critically ill patients
[209].
Major surgery can also lead to reduced mHLA-DR expression, resulting in adverse outcomes. In addition to surgical trauma, other causes of post-surgical immunosuppression may include intraoperative hypotension, increased perioperative release of corticosteroids or catecholamines, and the use of anesthetic drugs such as fentanyl. One retrospective analysis of a randomized controlled trial, covering 10 post-operatively immunosuppressed patients following esophageal or pancreatic resection, demonstrated that innate immunity recovered earlier than acquired immunity during severe postoperative immunosuppression. Among other immune markers, mHLA-DR expression was measured from before the operation up to day 5 after surgery; mean mHLA-DR recovery occurred on post-operative day 5
[210].
Another study aimed to describe the immediate immune response to major gastro-intestinal surgery in patients over 45 years old with a planned post-operative ICU stay. It was concluded that monocyte dysfunction and features of immune suppression occur frequently following major surgery, contributing to post-operative infection
[211].
Almansa et al. evaluated the use of procalcitonin (PCT) combined with gene expression levels of HLA-DRA to detect sepsis in 154 surgical patients. Multivariate and ROC/AUC analysis showed that the PCT/HLA-DRA ratio was superior to PCT alone for the detection of sepsis, with an AUC of 0.85. It was consequently concluded that the combination of PCT with HLA-DRA holds promise as a means of improving sepsis detection in surgical patients
[212].
From this discussion, it can be seen that monocytes play a critical role in the innate and adaptive immune systems, performing phagocytosis and orchestrating antigen presentation as well as cytokine production. Recent research has also shown that the MHC class II antigen presentation pathway in human monocytes differs by subset and is regulated by cytokines; as such, there is much yet to be explored
[213]. Going forward, it can be envisaged that HLA-DR could form a significant part of any immune dysfunction score in the assessment of sepsis, trauma and other forms of critical illness
[214].
Recently, two important studies (the CAPTAIN and ExPRESS studies) explored the feasibility of circulating and cell-surface immune biomarkers as predictors of infection in critically ill patients, reaching contrasting conclusions. The CAPTAIN study was conducted to assess the accuracy of circulating biomarkers in discriminating between sepsis and non-septic SIRS. A difference was shown in HLA-DR MFI on both CD14+High and CD14+Low monocytes between sepsis and non-septic SIRS patients (0.9 vs. 1.5, p = 0.05; and 2.9 vs. 4.2, p = 0.05, respectively). Additionally, there was a statistically significant difference in neutrophil CD64 MFI between the two groups (2.6 vs. 1.2, p = 0.01). Eight biomarkers had an area under the receiver operating characteristic curve (ROC-AUC) of over 0.6 with a 95% confidence interval over 0.5. LASSO regression analysis identified C-reactive protein (CRP) and HLA-DRA mRNA as being repeatedly associated with sepsis, and no model was found to perform better than CRP alone in this setting (ROC-AUC 0.76 (0.68–0.84)). It was therefore concluded that circulating biomarkers may not be useful for the detection of infection in the early phase of sepsis in ICU patients
[215].
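The ROC-AUC figures quoted throughout this section (0.76, 0.64, 0.85, 0.944 and so on) all reduce to the same quantity: the probability that a randomly chosen case has a more extreme marker value than a randomly chosen control. A minimal stdlib Python sketch of that identity, using made-up CRP-like values purely for illustration (the example numbers are not from the CAPTAIN data):

```python
def roc_auc(marker_cases, marker_controls, higher_is_worse=True):
    """ROC-AUC via the rank-sum (Mann-Whitney) identity: the fraction of
    case/control pairs in which the case has the more extreme value.
    Ties count as half a win. Set higher_is_worse=False for markers
    such as mHLA-DR, where LOW values indicate the adverse state."""
    wins = 0.0
    for c in marker_cases:
        for k in marker_controls:
            if c == k:
                wins += 0.5
            elif (c > k) == higher_is_worse:
                wins += 1.0
    return wins / (len(marker_cases) * len(marker_controls))

# Hypothetical CRP values (mg/L) for septic vs. non-septic SIRS patients
sepsis = [180, 220, 95, 240, 160]
sirs = [60, 110, 40, 150, 80]
print(round(roc_auc(sepsis, sirs), 2))  # 0.92
```

An AUC of 0.5 means the marker is uninformative; values such as the 0.76 reported for CRP above sit between chance and perfect discrimination.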
The ExPRESS-sepsis cohort study recruited patients presenting to emergency departments (EDs) with suspected acute infection and aimed to evaluate the reliability of leukocyte biomarkers as predictors of sepsis (Sequential Organ Failure Assessment score ≥ 2 at 24 h and/or 72 h following ED presentation). In this multicenter cohort study in four EDs and ICUs, flow cytometry was used to compare patients with suspected acute infection (Group 1) with two comparator cohorts: ICU patients with established sepsis (Group 2) and ED patients without infection or systemic inflammation but requiring hospitalization (Group 3); 272, 59 and 75 patients were recruited to Groups 1, 2 and 3, respectively. Of the 47 leukocyte biomarkers examined, 14 were found to be unreliable and 17 failed to discriminate between the three cohorts. In Group 1, eight neutrophil CD antigens, seven monocyte antigens and one T lymphocyte antigen were analyzed for their ability to predict subsequent sepsis in patients with suspected sepsis. Individually, only raised neutrophil PD-1 (OR 1.78 (95% CI 1.23–2.57); p = 0.002), raised monocyte PD-1 (OR 1.32 (1.03–1.70); p = 0.03) and reduced monocyte HLA-DR (OR 0.73 (0.55–0.97); p = 0.03) expression were associated with subsequent sepsis. From this large panel of leukocyte biomarkers, markers of early immune suppression (neutrophil and monocyte PD-1 and PD-L1; monocyte HLA-DR) had the strongest association with clinical outcomes. Increased neutrophil PD-1 and reduced monocyte HLA-DR expression were associated with deterioration to sepsis, suggesting that immune suppression may be an early event, occurring prior to the development of sepsis
[216].
Myeloid cell responses in sepsis are intertwined and complex. One example relates to the plasticity of these cells, which allows immature neutrophils to undergo differentiation to become monocytic cells
[217]. Following sepsis, decreased mRNA expression of MHC class II-related genes has been reported; in one study, the mRNA expression of five MHC class II-related genes (CD74, HLA-DRA, HLA-DMB, HLA-DMA, CIITA) was measured by quantitative reverse transcription (qRT)-PCR, and monocyte HLA-DR (mHLA-DR) by flow cytometry, in septic shock patients
[218]. The authors reported that the best prognostic value regarding lethal outcome was obtained for CD74 (the HLA-DR antigen-associated invariant chain). They concluded that decreased CD74 mRNA expression significantly predicted 28-day mortality following septic shock. Expression of the MHC class II-related genes HLA-DRA and CD74 was also investigated in patients with complicated and uncomplicated Staphylococcus aureus bacteremia (SAB) [219]. The complicated SAB group included patients with hematogenous seeding or extension of infection beyond the primary focus, etc. It was reported that patients with complicated SAB showed weaker HLA-DRA expression than those with uncomplicated SAB during the first week of bacteremia.
In a different study, HLA-DR expression on monocyte subsets was investigated in critically ill children
[220]. This population was compared with healthy children, and it was found that HLA-DR expression was significantly decreased within all monocyte subsets, most markedly on classical monocytes and in patients with sepsis. The authors concluded that low HLA-DR expression on classical monocytes was associated with NI and lethal outcome. Immune responses were investigated in another specific group, non-neutropenic patients with abdominal sepsis, with a focus on prospective invasive candidiasis (IC) risk prediction based on immune markers, including HLA-DR
[221]. The authors found that HLA-DR expression over the first five days showed no relevant difference between three groups of patients: those with no colonization or IC, those with subsequent colonization and those with subsequent IC.
Various aspects of monocyte signaling can be assessed as potential sepsis immune markers. A monocyte distribution width value greater than 20.0 U has been reported to be effective for sepsis detection in the emergency department
[222]. The authors of a recent study focused on a novel class of RNA that is naturally resistant to degradation by exonucleases, termed circular RNA (circRNA)
[223]. They explored patterns of circRNA expression in peripheral monocytes of critically ill patients with sepsis secondary to community-acquired pneumonia relative to healthy donors. The authors concluded that circRNAs were more abundant in the immune cells of sepsis patients.
The immune response in the context of different causative pathogens and sites of infection is seldom researched. Our group has investigated various aspects of the immune response to different bacteria, the origin of secondary sepsis and outcome
[12,203,204,224] for over a decade.