Anticoagulant drugs have been used to prevent and treat thrombosis. However, they are associated with a risk of hemorrhage; therefore, prior to their clinical use, it is important to assess the risks of both bleeding and thrombosis. For older anticoagulant drugs such as heparin and warfarin, dose adjustment is required owing to their narrow therapeutic ranges. The established monitoring methods for heparin and warfarin are the activated partial thromboplastin time (APTT)/anti-Xa assay and the prothrombin time–international normalized ratio (PT-INR), respectively. Since 2008, a new generation of anticoagulant drugs, the direct oral anticoagulants (DOACs), has been widely prescribed to prevent and treat several thromboembolic diseases. Although the use of DOACs without routine monitoring and frequent dose adjustment has been shown to be safe and effective, there may be clinical circumstances in specific patients in which measurement of the anticoagulant effects of DOACs is required. Recently, anticoagulation therapy has also received attention in the treatment of patients with coronavirus disease 2019 (COVID-19).
Thrombotic disorders require prompt treatment with anticoagulant drugs at therapeutic doses. Although these drugs are effective for the prevention and treatment of thrombosis, they are associated with the occurrence of hemorrhage; therefore, identifying patients at increased risk of bleeding is clinically important when selecting the optimal treatment and duration of anticoagulant therapy. Traditionally, the heparins (unfractionated heparin (UFH), low molecular weight heparin (LMWH), and fondaparinux) have been used widely. Heparin binds to antithrombin via its pentasaccharide sequence, catalyzing the inactivation of thrombin and other clotting factors [1]. Warfarin, a vitamin K antagonist that inhibits the γ-carboxylation of vitamin K-dependent coagulation factors, has also been used historically as an anticoagulant [1][2]. Both drugs require monitoring with clinical laboratory tests. In recent years, direct oral anticoagulants (DOACs) have been developed, comprising a direct factor IIa inhibitor (dabigatran) and direct factor Xa inhibitors (rivaroxaban, apixaban, and edoxaban); these overcome several limitations of warfarin treatment, such as food interactions and the need for frequent laboratory monitoring. DOACs are reported to have a superior safety profile compared with warfarin in some thrombotic disorders [3][4][5][6][7][8][9][10]. Nevertheless, even with this new generation of anticoagulant agents, the most relevant and frequent complication of treatment is major hemorrhage, which is associated with significant morbidity, mortality, and considerable costs [11][12][13]. Although routine monitoring of these drugs is not required, assessment of the anticoagulant effect may be desirable in special situations, and assessment of individual bleeding risk may be relevant when selecting the appropriate anticoagulant drug and treatment duration [14].
In addition to the introduction of these new anticoagulant drugs, anticoagulation therapy has received attention in the treatment of patients with coronavirus disease 2019 (COVID-19). Patients with COVID-19-associated pneumonia exhibit abnormal coagulation and organ dysfunction, and the resulting coagulopathy has been associated with a higher mortality rate [15][16]. Thus, a suitable treatment for this coagulopathy and thrombosis is required.
This review covers the anticoagulant mechanisms of heparin, warfarin, and DOACs, as well as the methods for measuring these drugs. In addition, we attempt to provide the latest information on the mechanisms of thrombosis in COVID-19 from the viewpoint of biological chemistry.
2.3. Direct Oral Anticoagulant (DOAC)
| | Dabigatran | Rivaroxaban | Apixaban | Edoxaban | Betrixaban |
|---|---|---|---|---|---|
| Target | Thrombin | Factor Xa | Factor Xa | Factor Xa | Factor Xa |
| Primary clearance | Renal | Renal | Fecal | Renal | Fecal |
| Tmax | 1.5–3 h | 2–3 h | 3–4 h | 1–2 h | 3–4 h |
| Half-life | 12–14 h | 5–13 h | 12 h | 10–14 h | 19–27 h |
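As a rough illustration of how the half-lives above translate into residual drug exposure over time, the following sketch applies simple first-order elimination, C(t) = C0 × 0.5^(t/t½). The single-compartment model and the representative half-life midpoints are simplifying assumptions made for illustration only; real DOAC pharmacokinetics are more complex.

```python
# Minimal sketch: first-order elimination using representative half-lives
# from the table above. Single-compartment kinetics are an assumption for
# illustration; actual pharmacokinetics depend on the patient and drug.

HALF_LIFE_H = {           # representative midpoints of the ranges above
    "dabigatran": 13.0,   # 12-14 h
    "rivaroxaban": 9.0,   # 5-13 h
    "apixaban": 12.0,     # 12 h
    "edoxaban": 12.0,     # 10-14 h
    "betrixaban": 23.0,   # 19-27 h
}

def residual_fraction(drug: str, hours_since_peak: float) -> float:
    """Fraction of the peak concentration remaining after first-order decay."""
    t_half = HALF_LIFE_H[drug]
    return 0.5 ** (hours_since_peak / t_half)

if __name__ == "__main__":
    # E.g., 24 h after peak, roughly 28% of peak dabigatran remains.
    for drug in HALF_LIFE_H:
        print(f"{drug}: {residual_fraction(drug, 24.0):.0%} of peak after 24 h")
```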
The use of DOACs without routine monitoring and frequent dose adjustment has been shown to be safe and effective in the majority of patients, making them more convenient anticoagulants than warfarin [70]. However, there are clinical circumstances in specific patients in which measurement of the anticoagulant effect of a DOAC may be required. The utility of such measurements has been discussed for the following cases [62][71]:
(i) Persistent bleeding or thrombosis
(ii) Decreased drug clearance resulting from impaired kidney function or liver disease
(iii) Identification of subtherapeutic or supratherapeutic levels in patients taking other drugs known to significantly affect pharmacokinetics
(iv) Extreme body weight (<40 kg or >120 kg)
(v) Perioperative management
(vi) Reversal of anticoagulation
(vii) Suspected overdose
(viii) Assessment of adherence to the treatment protocol
Measurement assays have been developed to fulfill these requirements. The optimal laboratory assay depends on the clinical situation, and the tests can be classified into two types: qualitative (presence or absence of drug) and quantitative (drug concentration in ng/mL).
Quantitative Assay for DOAC Measurement
Liquid chromatography–tandem mass spectrometry (LC-MS/MS) is considered the gold-standard quantitative assay for DOAC measurement because of its high specificity, sensitivity, selectivity, and reproducibility, and it is often used in clinical development to evaluate DOAC pharmacokinetics [95][96][97]. The lower limits of detection (LoD) and quantitation (LoQ) of LC-MS/MS for DOACs have been reported as 0.025–3 ng/mL, and the reportable range of quantitation has been reported as 5–500 ng/mL [98][99][100][101][102][103]. Although LC-MS/MS is considered the most accurate method for DOAC measurement, it is not widely used because of its limited availability and the complexity associated with the testing. In addition, LC-MS/MS can show high intra- and inter-assay coefficients of variation and must be calibrated with the purified drug, which contributes to series-to-series variation. Thus, drug-calibrated clot-based and chromogenic methods have been developed and adapted to automated coagulation analyzers.
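As an illustration of how the LoD, LoQ, and reportable range shape how a raw LC-MS/MS value is reported, the following sketch applies a simple decision rule. The specific cutoffs are hypothetical examples within the ranges cited above, since each laboratory validates its own limits.

```python
# Minimal sketch of result interpretation against LoD/LoQ and the
# reportable range. The cutoffs are hypothetical examples within the
# ranges cited above; each laboratory validates its own values.

LOD_NG_ML = 1.0          # lower limit of detection (example value)
LOQ_NG_ML = 5.0          # lower limit of quantitation (example value)
UPPER_NG_ML = 500.0      # upper end of reportable range (example value)

def report(measured_ng_ml: float) -> str:
    """Turn a raw LC-MS/MS value into a reportable statement."""
    if measured_ng_ml < LOD_NG_ML:
        return "drug not detected"
    if measured_ng_ml < LOQ_NG_ML:
        return "drug detected, below limit of quantitation"
    if measured_ng_ml > UPPER_NG_ML:
        return "above reportable range; dilute and re-assay"
    return f"{measured_ng_ml:.1f} ng/mL"

print(report(0.4))    # drug not detected
print(report(3.2))    # detected but not quantifiable
print(report(142.0))  # 142.0 ng/mL
```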
One clot-based kit for dabigatran measurement is the Hemoclot thrombin inhibitor assay (Hyphen BioMed, Neuville-sur-Oise, France). Diluted test plasma is mixed with normal pooled plasma, clotting is initiated by the addition of human thrombin, and the clotting time is measured on a coagulation analyzer. The clotting time is directly related to the dabigatran concentration, which is calculated from a calibration curve obtained with dabigatran calibrators. The clotting method has been shown to correlate well, with good agreement, with LC-MS/MS analysis [98]. Stangier et al. also reported that this kit enabled accurate assessment of dabigatran concentrations within the therapeutic range without inter-laboratory variability [104]. The correlation between the clotting method and qualitative tests such as the APTT and thrombin time (TT) has also been investigated. The APTT showed a modest correlation with dabigatran concentrations measured by the clotting method, but the correlation became less reliable at higher dabigatran levels, and sensitivity differed among APTT reagents. The TT was highly sensitive to the presence of dabigatran, exceeding 300 s at concentrations above 60 ng/mL; it was thus confirmed to be too sensitive to quantify dabigatran levels [105]. Another study likewise demonstrated that the APTT and TT could not identify drug presence at very low or very high concentrations, respectively [106]. The clotting method is therefore considered the more suitable assay for determining dabigatran concentrations. In addition, an accurate correlation has been reported between the ecarin clotting time or ecarin chromogenic assay and dabigatran concentrations [74]. Although the upper limit of measurement depends on the ecarin units and reagent composition, these assays have been reported to detect dabigatran at concentrations up to 940 ng/mL.
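Conceptually, a drug-calibrated clotting assay reduces to fitting a calibration curve from calibrator plasmas and reading patient results off that curve. The following sketch shows this logic with an ordinary least-squares line; the calibrator values are invented for illustration, and the linear model is an assumption (the kit's actual curve fitting may differ).

```python
# Minimal sketch of a drug-calibrated clot-based assay: fit a calibration
# line (clotting time vs. dabigatran concentration) from calibrator
# plasmas, then interpolate the patient clotting time. Calibrator values
# are invented for illustration; the kit's actual fitting may differ.

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical calibrators: dabigatran concentration (ng/mL) -> clotting time (s)
conc = [0.0, 50.0, 100.0, 200.0, 400.0]
time_s = [33.0, 45.0, 57.0, 81.0, 129.0]

slope, intercept = fit_line(conc, time_s)      # time = slope*conc + intercept

def dabigatran_ng_ml(clotting_time_s: float) -> float:
    """Invert the calibration line to estimate drug concentration."""
    return (clotting_time_s - intercept) / slope

print(f"{dabigatran_ng_ml(69.0):.0f} ng/mL")   # ~150 ng/mL for a 69 s clot time
```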
The chromogenic anti-Xa assay has been used in clinical laboratories for several decades to assess heparin concentrations, and it is also appropriate for measuring the concentrations of direct factor Xa inhibitors. The assay is a kinetic method based on the inhibition of a constant, limiting concentration of factor Xa by the factor Xa inhibitor in the sample. The sample is mixed with an excess of factor Xa, to which the inhibitor binds; the residual factor Xa then cleaves a factor Xa-specific chromogenic substrate, and its amidolytic activity is measured. The amount of cleaved substrate is inversely proportional to the concentration of factor Xa inhibitor in the sample, and the concentration is calculated from a calibration curve obtained with rivaroxaban, apixaban, or edoxaban calibrators, respectively. This assay has been shown to correlate with LC-MS/MS, with small inter-laboratory variability [100][107][108]. Douxfils et al. reported that the correlation between the PT and LC-MS/MS was not linear and that sensitivity to the drugs differed among reagents [100]; it could therefore be difficult to standardize PT reagents for estimating drug concentrations. The Biophen DiXaI kit is useful for quantifying drug concentrations because it is highly sensitive to the drugs and is adapted to automated coagulation analyzers. Because DOACs have short half-lives, their concentrations are time-dependent, and peak concentrations differ considerably from trough concentrations. Information on when the patient took the drug is therefore useful for interpreting drug concentration data.
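Because the chromogenic signal falls as the inhibitor concentration rises, the calibration curve for an anti-Xa assay is inverted relative to the clot-based method above. The following sketch illustrates the idea with piecewise-linear interpolation over hypothetical calibrator points; the actual kit's curve model and calibrator values will differ.

```python
# Minimal sketch of a chromogenic anti-Xa readout: residual factor Xa
# activity (rate of chromogenic substrate cleavage, mOD/min) decreases as
# the factor Xa inhibitor concentration increases, so the calibration
# curve is inverted. Calibrator values are invented for illustration.

from bisect import bisect_left

# Hypothetical rivaroxaban calibrators: (signal mOD/min, concentration ng/mL),
# sorted by descending signal (ascending concentration).
CAL = [(300.0, 0.0), (220.0, 50.0), (160.0, 100.0), (90.0, 200.0), (40.0, 400.0)]

def rivaroxaban_ng_ml(signal: float) -> float:
    """Piecewise-linear interpolation of the inverted calibration curve."""
    sigs = [-s for s, _ in CAL]                  # negate so the list ascends
    i = bisect_left(sigs, -signal)
    i = min(max(i, 1), len(CAL) - 1)             # clamp to interpolation range
    (s0, c0), (s1, c1) = CAL[i - 1], CAL[i]
    return c0 + (c1 - c0) * (signal - s0) / (s1 - s0)

print(f"{rivaroxaban_ng_ml(190.0):.0f} ng/mL")   # ~75 ng/mL (between calibrators)
```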
This entry is adapted from the peer-reviewed paper 10.3390/biomedicines9030264