Convergence of empirical and theoretical literature to increase effectiveness of DMHIs
An integrated blueprint suggested that eminent DMH platforms are needed to increase the effectiveness of both self-guided and guided DMHIs [46]. The lack of highly effective, evaluated DMH platforms is entrenched in the struggle to innovate sustainably. Underlying quality, safety, and usability issues stem from the difficulty of converging theoretical, data-driven/technological, and empirical research, as well as of satisfying the HCI demands of mental health care professionals and users [19,47,48]. The development of optimized patient-centric digital tools is not the problem; rather, it is how long it takes mental health care professionals to adapt to using these tools. For example, DMHIs may assist in preventing the sequelae of mental illness quickly and accurately through predictive systems that apply DMH platforms and AI-driven apps [19,39,49,50]. A trial-and-error approach may be necessary to overhaul how codesign, behavior theories, and clinical evaluation are applied [51]. There is also a need to confront the lagging human factors that limit the successful implementation of DMH platforms and effective industry standards.
2. Principal Findings of Empirical Literature
A slightly larger qualitative than quantitative evidence base was found, although the difference was accounted for by mixed-methods studies. Overall, the studies mainly evaluated feasibility, usability, engagement, acceptability, and effectiveness. Although the use of DMH platforms and DMHIs in mental health care and suicide prevention was found to be feasible, the results highlight the need to increase usability and engagement in addition to effectiveness and quality.
The main types of DMH platforms used in the 22 included empirical studies are categorized as integrated, guided, self-guided, integrated-multifunctional, multimodal, and direct-to-consumer tele-mental health. This contrasted with previous reviews, which mostly reported off-the-shelf solutions delivered through computers, mobile apps, text message, telephone, web, CD-ROM, and video for general-population DMHIs for suicidal ideation and mental health co-morbidities [20]. Other previous reviews focused on general mental health support [10], in addition to self-guided digital tools for anxiety and depression in general populations [31,32]. In line with previous reports of variability in applications of use, the empirical evidence suggests DMH platforms and DMHIs are used for a range of purposes, e.g., to treat loneliness and to aid suicide prevention.
The high number and frequent use of DMH tools [9] are reflected in the evaluative evidence base on the use of DMH platforms and DMHIs. In line with the previous findings of Borghouts et al. [10], there was heterogeneity in the mostly preliminary evidence. These findings mainly concerned feasibility, usability, engagement, and acceptability rather than the effectiveness of each DMH platform or DMHI.
The most significant finding overall arose from the RCT of SilverCloud’s ICBT for anxiety and depression [52]. This RCT followed a study that established the efficacy of ICBT for adults with depressive symptoms [45]. The general lack of study follow-up in the domain has hindered evaluation, a notable gap given that there are more than 100 DMH programs for depressed and anxious adults [11,12]. RCTs are considered the “gold standard” by which psychological interventions are evaluated and subsequently adopted into general clinical practice [53]. However, RCTs have some limitations for developing treatment guidelines in terms of pragmatic application from a sample to the individual patient. For example, the baseline characteristics of the RCT by Richards et al. [52] showed that 70% of the sample were female, only slightly higher than the program referral rate for females (65%). This incidental finding highlights the inherent difficulties in recruiting and engaging men in mental health research [48].
The significant and preliminary evidence found in the empirical literature does not tell the whole story regarding efficacy and effectiveness. For example, a previous review reported that the DMH platform MOST is safe and effective [44]. However, it is not clearly stated what it is effective for. It appears from the qualitative study by Valentine et al. [54] that young people supported blended care through Horyzons (a derivative of MOST). However, further evaluative research on efficacy is needed, e.g., on the therapeutic alliance, clinical and social outcomes, cost-effectiveness, and engagement. The RCT of Horyzons also found no significant effect on its primary outcome, social functioning, compared with treatment as usual [55]. This RCT followed extensive design and implementation work [42] and the augmentation of social connectedness and empowerment in youth with first-episode psychosis [43]. These examples highlight the need for the current study, which distinguished between evaluative research on the effectiveness of the DMH platform and research on the effectiveness of the DMHI applied on the DMH platform.
The previous body of knowledge noted the difference between rigorous evidence of efficacy in trials and outcomes that indicate a lack of real-world impact [35]. The current study supports this finding. However, clarifying the distinction between efficacy and effectiveness may further assist the evaluation of DMH platforms and DMHIs. For example, Craig et al. [56] evaluated the AFFIRM Online DMH platform and reported efficacy under ideal circumstances. However, the RCT of SilverCloud’s ICBT for anxiety and depression [52] was deemed more significant as evidence because it applied a waiting-list control to demonstrate pragmatic effectiveness under less than ideal circumstances. Evaluation of DMHIs may produce relevant, measurable, responsive, and resourced indications of safety or effectiveness for their intended mental health care and/or suicide prevention purposes. RCTs can bolster these claims through randomization, which decreases bias and offers a rigorous tool to examine cause-effect relationships between an intervention and an outcome. However, a successful RCT may not be required to demonstrate safety and effectiveness.
3. Secondary Findings of Empirical Literature
Robust stakeholder engagement is required to ensure responsiveness to needs and to gain support for DMH implementation. The previous review noted the existence of targeted strategies to serve young people in mental health care [13,42]. Although the evidence synthesis found a greater focus on adults, a slightly higher number of targeted strategies was found for young people. However, there is a need for more effective qualitative strategies, such as designing and implementing youth-oriented tailored solutions [57] and implementing a centralized DMH platform to improve stakeholder accessibility [58]. The previous review by Spadaro et al. [51] suggested overhauling the application of codesign, behavior theories, and clinical evaluation. In line with this, a qualitative study that evaluated DMHIs on the Innowell DMH platform articulated several implementation problems: restricted access, siloed services, interventions poorly matched to service users’ needs, underuse of personal outcome monitoring to track progress, exclusion of family and carers, and suboptimal experiences of care [59]. A subsequent evaluation of the Innowell DMH platform found that national scalability is hindered by human factors: the main problem is not the technology but the people who implement and use it [60]. This is in line with previous findings about the constraints in instructing the recipients of technologies [5] and in transforming clinicians’ strong interest in using technology into actual use [4].
A previous review found that human-centered design is important for the codesign process to instill an understanding of how DMH platforms can be used with engaging effectiveness [47]. However, human-centered design is often not implemented well in DMH services. The evidence on HCI issues was consistent with this; for example, the relationship between the UI of a DMH platform and treatment effectiveness was unclear [61]. Furthermore, the results indicate that young people who perceived DMH platforms as useful in blended care were more willing to use the system in the future [54,62]. The results with the Innowell DMH platform suggested that codesign is not a foolproof method for increasing effectiveness with DMH platforms [60]. Previous findings on the need for key stakeholder and user input [3] were echoed, in addition to the call for funding and resources to expand regional case studies to the state level and beyond.