Review

Evaluation of the Use of Digital Mental Health Platforms and Interventions: Scoping Review

Australian Institute for Suicide Research and Prevention, School of Applied Psychology, Griffith University, Messines Ridge Road, Mount Gravatt, QLD 4122, Australia
* Author to whom correspondence should be addressed.
Int. J. Environ. Res. Public Health 2023, 20(1), 362; https://doi.org/10.3390/ijerph20010362
Submission received: 14 November 2022 / Revised: 22 December 2022 / Accepted: 23 December 2022 / Published: 26 December 2022
(This article belongs to the Special Issue Digital Mental Health: Changes, Challenges and Success Strategies)

Abstract

Background: The increasing use of digital mental health (DMH) platforms and digital mental health interventions (DMHIs) is hindered by uncertainty over their effectiveness, quality and usability. There is a need to identify the types of available evidence in this domain. Aim: This scoping review identifies evaluations of (1) the DMH platform/s used; and (2) the DMHI/s applied on the DMH platform/s. Methods: The Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) guided the review process. Empirical journal articles (published 2012–2022) that focused on evaluation of the use and application of DMH platforms were included. A literature search was conducted using four electronic databases (Scopus, ScienceDirect, Sage and ACM Digital Library) and two search engines (PubMed and Google Scholar). Results: A total of 6874 nonduplicate records were identified, of which 144 were analyzed and 22 met the inclusion criteria. The included studies addressed general/unspecified mental health and/or suicidality indications (n = 9, 40.9%), followed by depression (n = 5, 22.7%), psychosis (n = 3, 13.6%), anxiety and depression (n = 2, 9.1%), as well as anxiety, depression and suicidality (n = 1, 4.5%), loneliness (n = 1, 4.5%), and addiction (n = 1, 4.5%). There were 11 qualitative studies (50%), 8 quantitative studies (36.4%), and 3 mixed-methods studies (13.6%). Eleven studies evaluated the DMH platform/s and 11 studies evaluated the DMHI/s. The studies focused on feasibility, usability, engagement, acceptability and effectiveness. Significant evidence was scarce (1 study in each set of 11), most notably the (cost-)effectiveness of a DMHI with significant long-term impact on anxiety and depression in adults. Conclusion: The empirical research demonstrates the feasibility of DMH platforms and DMHIs. To date, the evidence for their effectiveness, quality and usability is mostly heterogeneous and preliminary. However, one scalable DMHI reported effectiveness in treating adults’ anxiety and depression. The scope of effectiveness may be widened through targeted strategies, for example by engaging independent young people.

1. Introduction

1.1. Background

Mental illness and suicide are ongoing primary global health problems [1] that need accessible and scalable solutions. One such solution is digital mental health (DMH), a contemporary method of mental health care distinguished by the large-scale integration of telehealth [2], apps [3,4], and digital platforms [5], as well as the promise of big data, genomics and artificial intelligence (AI) [6]. DMH platforms are a key technology for assessment, support, prevention, and treatment in mental health [7]. Generally, digital platforms are an online space to exchange products, services, and information. The DMH global market is predicted to grow from USD 2568.6 million in 2021 to USD 18,717.5 million by 2030, at a compound annual growth rate of 21.1% [8]. An overview of systematic reviews summarized the research on the effectiveness of technology in DMH and found an extensive number of DMH interventions (DMHIs) addressing gaps in mental health service provision, in addition to shifting focus and target populations [9]. A hindering issue for the advancement of DMH is the sustained engagement of service users [10]. Therefore, it is important to provide a systematic approach to discern which DMH platforms and DMHIs are effective, usable and of good quality. Furthermore, it is necessary to clarify which mental health indications and populations these digital solutions are suitable for. To our knowledge, there is a lack of reviews that identify the types of available evidence on the use of DMH platforms.
The aim of this scoping review is to describe:
(1) Empirical studies that focused on evaluation of the DMH platform/s used; and
(2) Empirical studies that focused on evaluation of the DMHI/s applied on the DMH platform/s.
To this end, we first provide an overview of existing work on (1) The use and functionality of DMH platforms, (2) Effectiveness of and engagement with DMHIs, (3) Implementation barriers for DMH platforms, (4) Recommendations for overcoming implementation barriers, (5) Evaluative research for the use of DMH platforms and DMHIs, and (6) Convergence of empirical and theoretical literature to increase effectiveness of DMHIs.

1.2. Overview of Existing Work

(1) The use and functionality of DMH platforms
Digital platforms are used in various contexts in DMH (see Appendix A.1 for definitions). For example, DMH platforms are used in more than 100 services for adults with anxiety and depression [11,12]. There is a priority to establish evidence for their use in servicing people with diagnosed mental disorders [5]. DMH platforms are also used to assist early intervention strategies for young people, for example, to help practitioners deliver quality, personalized and measurement-based care for young people’s overall health, mental health, everyday function, suicidal thoughts/behaviors and social connectedness [13]. The use of digital platforms for video chats, social networks, telephone calls, and emails as a means of communication is effective at the population level for anxiety and depression, although screening and intervention, AI-driven technologies, social media and digital phenotyping are generally not effectively used in DMH [14].
Internet-delivered cognitive behavioral therapy (ICBT) is the most used DMHI. ICBT is widely accessible, efficient, (cost-)effective and adaptable [15,16]. Self-guided treatment (28.4%) and guided telehealth/peer-to-peer approaches (16.3%) are the most used DMH services, followed by real-time AI diagnostic assessments in computational psychiatry (13.7%), consumer journaling and support signposting (10%), physical, augmented and virtual reality (6.8%), diagnostic support (6.3%), gamified digital treatments (5.3%), neurological interventions (4.7%), digital phenotyping (4.2%), and virtual assistants (4.2%) [17]. Standalone suicide prevention digital platforms are rare because suicide prevention is usually combined with DMH platforms [18].
The different types and uses of DMH platforms mean it is necessary to distinguish among them in terms of functionality, that is, how useful a platform is and how well it performs its designated job. For example, given the futility of risk assessment in psychiatry, the functionality of AI-based DMH platforms depends on their being combined with personalized mental health care [19].
(2) Effectiveness of and engagement with DMHIs
A systematic review found several efficacious, scalable and sustainable suicide prevention interventions providing the opportunity for population-level impact and strategies to enhance effectiveness and reach [20]. Psychiatric diseases contribute to 60–98% of suicides [21]. Suicide prevention DMHIs may help augment ongoing clinical care if practitioners exercise caution in recommending suitable interventions and are aware of the security of the data that is collected [18]. Although integrating DMHIs into psychiatric care shows promising results for real-time monitoring and feedback on changes in common symptoms (e.g., stress, anxiety, and depression) [22], caution needs to be exercised in making recommendations for interventions on distress and suicidality because of uncertainty about their effectiveness and evaluation [19].
Meta-analyses of randomized controlled trials (RCTs), an experimental form of impact evaluation with a randomly selected sample and control group from the same population, noted potential efficacy for DMHIs for anxiety [23,24] and depression [25] in general populations. It was suggested to focus studies on comparisons with face-to-face psychological care [23]. This focus may help extract which aspects of the technologies produce beneficial effects and for which populations [25]. It may also help focus more studies on routine care populations [24]. There is good potential for DMH platforms to be used in applying affordable interventions and preventive treatments [26,27,28]. However, the consumer marketplace is currently inundated with apps that lack engagement and efficacy [5]. Systematic reviews found a lack of a clear and comprehensive evidence base, although there is a growing consensus that the most effective DMHIs are used for anxiety and depression, particularly with college students [29] and young people [30]. A systematic review reported that DMHIs have higher sustained engagement than self-guided digital tools [10]. This finding was endorsed by meta-analyses centered on anxiety [31] and depression [32].
(3) Implementation barriers for DMH platforms
A range of barriers hinders the effective and sustained implementation of DMH platforms. For example, the field is constrained by issues of affordability [33], accessibility, relevance, reliability, a lack of personalization and human capacity [12], technical and ethical considerations [34], as well as privacy and security, efficacy, engagement, and clinical integration [5]. There is rigorous evidence of efficacy in trials, but a lack of real-world impact [35] means the impact is inconsistent. This is because of difficulties in instructing patients and mental health care professionals in using DMH platforms, as well as the regulatory context of health care delivery [5]. The promising results in support of DMH platforms may be hindered by the human factors of human–computer interaction (HCI) (e.g., organizational readiness and usability in the healthcare context) [14]. For example, there was a 500% increase in the use of tailored self-guided resources by healthcare workers during the COVID-19 pandemic, although most dropped out of treatment because of time constraints, privacy concerns, treatment relevancy and satisfaction with the digital health platform design and experience [36].
(4) Recommendations for overcoming implementation barriers
Different levels of DMH platform evaluation are required ranging from feasibility and pilot studies on user retention/acceptability, safety and satisfaction through to RCTs and implementation feasibility studies [37]. Apps need to be moved to an integrated digital platform, and digital tools need to be highly effective and engaging, address inequalities, and build trust in their authenticity [35]. There also needs to be better (cost-)effectiveness [38,39]. Furthermore, innovation is required to converge pattern-based and hypothesis-driven methods for evaluation of rigorous preventive strategies and interventions [5,19,40]. Codesign may help to strengthen the human-centered design process and instill an understanding of how an application achieves real-world effectiveness [14]. All the aspects surrounding innovation must be considered for the sustained use of DMH platforms. ‘Convergence mental health’ is recommended to facilitate access to and use of DMH services through integrating scientists, clinicians, bioinformaticists, global health experts, engineers, technology entrepreneurs, medical educators, caregivers, and patients as well as infusing synergy between government, academia, and industry for multidisciplinary applied and translational solutions [41].
(5) Evaluative research for the use of DMH platforms and DMHIs
There is a small amount of previous review and analysis on (1) evaluation of the use of DMH platforms and (2) evaluation of the use of DMHIs. As an example of (1), the DMH platform MOST was applied in evaluative research that highlighted the potential of novel multimodal approaches to help-seeking by connecting MOST with clinical services to provide support in real time and to sustain mental health recovery for young people [42]. An earlier pilot study established the acceptability, safety and initial clinical benefits of the Horyzons DMH platform for peer-to-peer social networking, individually tailored interactive psychosocial interventions, and expert interdisciplinary and peer moderation [43]. MOST was reported to be safe and effective for evidence-based mental health support for young people with psychoses, depression, social anxiety, mental illness and suicidal risk [44]. As an example of (2), an RCT study demonstrated the efficacy of an ICBT program—‘Space from Depression’—for adults with depressive symptoms [45].
(6) Convergence of empirical and theoretical literature to increase effectiveness of DMHIs
An integrated blueprint suggested eminent DMH platforms are needed to increase the effectiveness of DMHIs in self-guided and guided approaches [46]. The lack of highly effective, evaluated DMH platforms is entrenched in the struggle to sustainably innovate. There are underlying quality, safety and usability issues stemming from the difficulty of converging theoretical, data-driven/technological and empirical research, as well as satisfying mental health care professionals’ and users’ HCI demands [19,47,48]. The development of optimized patient-centric digital tools is not the problem; rather, it is how long it takes mental health care professionals to adapt to using these tools. For example, DMHIs may assist the prevention of the sequelae of mental illness quickly and accurately through predictive systems that apply DMH platforms and AI-driven apps [19,39,49,50]. A trial-and-error approach may be necessary to overhaul how codesign, behavior theories, and clinical evaluation are applied [51]. There is also a need to confront the lagging human factors that limit the successful implementation of DMH platforms and effective industry standards.

2. Methods

2.1. Overview

A scoping review methodology was undertaken to summarize empirical studies that evaluated web-based, smartphone and cross-platform DMH platforms and DMHIs used in assessment, support, prevention, and treatment for all indications of mental health disorders as well as suicidality. The reason for focusing on all mental health disorders (i.e., schizophrenia; anxiety, bipolar, depressive, autism spectrum, attention deficit hyperactivity, conduct and other mental disorders; idiopathic developmental intellectual disability; and eating disorders) is that prevention and early intervention are important for decreasing the sequelae of mental illness, and new ways of assessment, support and treatment may be possible with DMH [19]. Suicidality is included because it may sometimes occur separately from a mental health disorder. The selection of methods was guided by the purpose and framework of scoping reviews [52] and the description of the 6 different exemplars for scoping reviews [53]. Exemplars provide an ideal model to follow. We chose to follow the exemplar ‘to identify the types of available evidence’. Our review started with planning the review procedure and continued with a search process and practical screening of articles to identify evidence. We focused on evidence concerning evaluation of the DMH platform/s used, in addition to evidence concerning evaluation of the DMHI/s applied on the DMH platform/s. Issues with generalizability and validity were mostly unknown because of the large body of evidence in the domain. The intention was to enable knowledge by clarifying the type of DMH platform used in each empirical study and in what context it was evaluated. We described the type of study/aim, its purpose, population, outcomes/form of evidence as well as the model/s of care applied. The aim and outcomes of each study were examined to determine the types of DMH platforms and DMHIs used.
Included studies were selected and assessed for compliance with predetermined inclusion criteria. These were described and illustrated according to a modified Preferred Reporting Items for Systematic Reviews and Meta-Analyses—extension for Scoping Reviews (PRISMA-ScR) [54]. This procedure identified the current position of the evidence in the domain by separating studies that evaluated the use of DMH platform/s from those that evaluated their use in DMHI/s. Studies that primarily used apps, AI-driven immersive/interactive/wearable technologies, social media and digital phenotyping for mental health care and/or suicide prevention were out of scope because digital platforms are the most used technologies in self-guided and guided approaches [17]. Therefore, digital platforms are the most likely technology to be associated with evidence. There are safety and quality concerns about apps because there are more than 10,000 available [55], and apps have a low rate of testing (30%) for individuals with clinical conditions [56].

2.2. Search Strategy

Based on the research aims, the search terminology “digital platform” AND “mental health care” OR “suicide prevention” was used on 7 April 2022 to search full text journal articles in 4 databases—Scopus, ScienceDirect, Sage, and the Association for Computing Machinery (ACM) Digital Library. The same search terminology was used in 2 search engines (PubMed and Google Scholar). A combination of other search terms was tested. These databases and search engines were tested for variance in searches of the following search terms “digital mental health”, “platform”, “multifunctional”, “mental health care”, “distress”, “suicide prevention”, “suicide behavior prediction”, “self-help”, “guided”, “digital interventions”, “depression”, “anxiety”, “suicide” and “wellbeing”. However, the results of various combinations of these search terms found no further relevant articles. Therefore, the other search terms were excluded, and 22 articles were deemed to be suitable for inclusion, underlining the narrow focus of the field.
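As an illustration of how the Boolean search string above could be reproduced programmatically, the following minimal Python sketch submits it to PubMed’s public E-utilities API. This is not the procedure used in the review (searches were run through the database and search-engine interfaces); the date filter, result limit and field handling here are assumptions added for demonstration only.

```python
# Illustrative only: reconstructs the review's Boolean search string and submits it
# to PubMed's public E-utilities API. The review itself ran the searches through the
# database/search-engine interfaces on 7 April 2022; the date filter and result limit
# below are assumptions added for demonstration, not reported settings.
import requests

# Verbatim search terminology from the review. Note that PubMed evaluates Boolean
# operators left to right, so explicit parentheses may be needed in practice.
QUERY = '"digital platform" AND "mental health care" OR "suicide prevention"'


def search_pubmed(query: str, max_records: int = 100) -> list[str]:
    """Return PubMed IDs matching the query, restricted to 2012-2022 publication dates."""
    resp = requests.get(
        "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi",
        params={
            "db": "pubmed",
            "term": query,
            "mindate": "2012",
            "maxdate": "2022",
            "datetype": "pdat",
            "retmax": max_records,
            "retmode": "json",
        },
        timeout=30,
    )
    resp.raise_for_status()
    result = resp.json()["esearchresult"]
    print(f"{result['count']} records matched the query")
    return result["idlist"]


if __name__ == "__main__":
    pubmed_ids = search_pubmed(QUERY)
```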
The ACM database was selected to cover computing and information technology articles. PubMed was selected to include medical and psychology-related articles. Scopus, ScienceDirect, Sage and Google Scholar were chosen to include studies in multidisciplinary areas of interest, including psychology, the social sciences, and hybrid studies that used digital platforms. Article types included qualitative, quantitative and mixed-methods studies published between 2012 and 2022. We distinguished qualitative from quantitative studies (including clinical trials) using the following health-focused research methods definition by Denny and Weckesser [57]: qualitative study designs aim to provide insight and understanding of an individual’s experience in terms of thoughts and behaviors, whereas quantitative research aims to detail what happened, for example through applying randomized evaluations. The title, abstract and keywords were screened. All articles were in English. The evidence base was inferred to be mostly from within the previous 5 years, although the inclusion period was extended to the prior 10 years to minimize selection bias. For example, the systematic overview on evidence for DMHIs for young people by Lehtimaki et al. [30] applied a prior 10-year period in the inclusion criteria although the 4 systematic reviews included were from the prior 4 years.
The inclusion and exclusion criteria and data extraction format were drafted by the first author (LB) and then reviewed and finalized in coordination with the co-author (DDL). The preliminary search process involved a screening of the search results carried out by the first author. Data extraction and full-text review were performed by the first author applying the inclusion and exclusion criteria. A quality appraisal and consultation with the co-author was applied to reduce bias and uncertainty and to create reliability and trust in the research. Ambiguities were reduced through discussion and consensus among the authors.

2.3. Inclusion and Exclusion Criteria

The inclusion and exclusion criteria (see Appendix A.2) informed the selection of studies. An article was kept if it met the inclusion criteria and was disqualified if it met any of the exclusion criteria.

2.4. Data Analysis and Synthesis

The first author extracted the data from the shortlisted articles, based on the research aims. The study design/aim, DMH platform (type, purpose of use, and population), outcomes/form of evidence and the approach/comparison were tabled to organize the evaluation. One table organized the studies that focused on the DMH platform/s, and a second table organized the studies on the DMHI/s applied on the DMH platform/s. If these details were not clear, the DMH platform or DMHI was analyzed to determine what it was put in place for. Therefore, each study’s aim was compared with its outcomes to determine whether it was primarily focused on evaluation of the DMH platform or the DMHIs applied on it. If the population details were not clearly stated, then the user recipients were extrapolated through interpretation. For example, it was inferred that the Swedish general population use the Swedish health care system. The DMH platforms were categorized according to our previous reviews, which identified tele-mental health, online self-guided and/or online guided therapy, as well as multifunctional and/or integrated DMH platforms as the 5 main types reported [47,48]. Due to the heterogeneity of the included studies, a narrative synthesis was undertaken.
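To make the charting structure concrete, the sketch below shows one way the extracted fields could be organized in code. It is illustrative only, not the authors’ extraction tooling; the field names and the example row are assumptions based on details reported in the Results.

```python
# A minimal sketch (not the authors' actual extraction tooling) of the data charting
# described above: each included study is recorded with its design/aim, platform
# details, outcomes and model of care, then grouped into the two Results tables.
# Field names and the example row are assumptions based on details reported in the text.
from collections import defaultdict
from dataclasses import dataclass


@dataclass
class ChartedStudy:
    citation: str
    design_aim: str       # e.g., "RCT on effectiveness"
    platform: str
    platform_type: str    # tele-mental health, self-guided, guided, multifunctional, integrated
    purpose: str
    population: str
    outcomes: str         # form of evidence reported
    model_of_care: str    # e.g., "stepped care"
    focus: str            # "platform" or "DMHI", decided by comparing the aim with the outcomes


def tabulate(studies: list[ChartedStudy]) -> dict[str, list[ChartedStudy]]:
    """Split charted studies into the two evaluation tables used in the Results."""
    tables: dict[str, list[ChartedStudy]] = defaultdict(list)
    for study in studies:
        tables[study.focus].append(study)
    return tables


# Hypothetical example row, paraphrasing one included study:
example = ChartedStudy(
    citation="Richards et al. [76]",
    design_aim="RCT on effectiveness",
    platform="SilverCloud",
    platform_type="guided",
    purpose="ICBT for anxiety and depression",
    population="UK general population adults",
    outcomes="(cost-)effectiveness with long-term impact",
    model_of_care="stepped care",
    focus="DMHI",
)
print(len(tabulate([example])["DMHI"]))  # -> 1
```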

3. Results

3.1. Selection of Articles

In total, 6879 records were retrieved from databases and search engines including: 3346 (48.6%) from ScienceDirect (11 were assessed for eligibility and 11 were excluded); 1481 (21.5%) from PubMed (1 was assessed for eligibility and it was excluded); 1010 (14.7%) from Google Scholar (75 were assessed for eligibility—7 were included and 68 excluded); 804 (11.7%) from Sage (12 were assessed for eligibility—2 were included and 10 excluded); 145 (2.1%) from Scopus (25 studies were assessed for eligibility—9 were included and 16 excluded); 75 (1.1%) from ACM Digital Library (2 studies assessed for eligibility and 2 were excluded), and 18 (0.3%) records from additional sources (i.e., reference lists of included studies—18 studies assessed for eligibility—13 were excluded and 5 included).
Out of the 6879 records retrieved, 5 duplicates were removed. Therefore, 6874 records were screened by reading their title, abstracts and keywords. Full texts of 144 records (2.1%) were assessed for eligibility—22 (15.3%) empirical studies met the inclusion criteria and 122 (84.7%) were excluded. The reasons for exclusion were because the articles were assessed to be about digital platforms with no mental health care or suicide prevention outcomes, descriptions of DMH platform development with no outcomes, DMH platform trial descriptions with no results, and follow up articles with the same DMH platform. Studies were checked for follow-ups with the same digital solution—1 article was excluded on this basis—the most recent and better-quality findings were included. The selection process (see Figure 1) was based on a modified version of the Preferred Reporting Items for Systematic Reviews and Meta-Analyses—extension for Scoping Reviews (PRISMA-ScR) [54]. See Appendix B for the PRISMA-ScR Checklist.
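As a transparency aid, the short sketch below reproduces the screening arithmetic reported above from the per-source retrieval counts; it is a simple way to check that the PRISMA-ScR flow percentages are internally consistent.

```python
# A small sanity check that reproduces the screening arithmetic reported above
# from the per-source retrieval counts (all figures are taken from the text).
retrieved = {
    "ScienceDirect": 3346,
    "PubMed": 1481,
    "Google Scholar": 1010,
    "Sage": 804,
    "Scopus": 145,
    "ACM Digital Library": 75,
    "Additional sources": 18,
}

total_retrieved = sum(retrieved.values())        # 6879
duplicates_removed = 5
screened = total_retrieved - duplicates_removed  # 6874 records screened on title/abstract/keywords
full_text_assessed, included = 144, 22

print(f"Records screened: {screened}")
print(f"Full texts assessed: {full_text_assessed} ({full_text_assessed / screened:.1%} of screened)")
print(f"Included: {included} ({included / full_text_assessed:.1%} of full texts)")
print(f"Excluded at full text: {full_text_assessed - included} "
      f"({(full_text_assessed - included) / full_text_assessed:.1%} of full texts)")
```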

3.2. Summary of Results

The scoping review findings are summarized in two overviews. Firstly, for the 11 empirical studies that focused on evaluation of the DMH platform/s used (see Table 1). Secondly, for the 11 empirical studies that focused on evaluation of the DMHI/s applied on the DMH platform/s (see Table 2).

3.2.1. Main Characteristics of the Included Studies

  • The studies were conducted in Australia (n = 10, 45.5%), Europe (n = 6, 27.2%) and North America (n = 6, 27.2%).
  • Most of the studies did not include specific age groups. It was inferred that 15 (68.2%) of the included studies were generally focused on adults and 7 (31.8%) of the included studies were focused on young people including children, adolescents, as well as college and university students aged 18–28.
  • Most of the studies addressed the use of DMH platforms for general/unspecified mental health and/or suicidality indications (n = 9, 40.9%), followed by depression (n = 5, 22.7%), psychosis (n = 3, 13.6%), anxiety and depression (n = 2, 9.1%), as well as anxiety, depression and suicidality (n = 1, 4.5%), loneliness (n = 1, 4.5%), and addiction (n = 1, 4.5%).
  • Targeted strategies were reported in 8/22 studies (36.4%), comprising youth with psychosis (n = 3, 13.6%), depression and stress in LGBTQA+ youth (n = 1, 4.5%), secondary students with symptoms of anxiety and depression (n = 1, 4.5%), mothers with postpartum depression (n = 1, 4.5%), loneliness in adults (n = 1, 4.5%), and adults with addictions (n = 1, 4.5%).
  • The types of DMH platforms used were integrated (n = 5, 22.7%), integrated-multifunctional (n = 5, 22.7%), guided therapy (n = 5, 22.7%), self-guided and guided therapy (n = 3, 13.6%), multimodal (n = 1, 4.5%), self-guided (n = 1, 4.5%), direct to consumer tele-mental health (n = 1, 4.5%), and an unspecified range of existing DMH platforms (n = 1, 4.5%).
  • The studies were mostly investigated with a blended mental health care approach (n = 11, 50%). Some were combined with a comparison approach: blended mental health care and usual primary care (n = 2, 9.1%); blended mental health care and waitlist control (n = 2, 9.1%); blended mental health care and online self-guided (n = 1, 4.5%). Stepped mental health care approaches were less common and combined with comparisons where implemented: stepped mental health care and self-guided (n = 1, 4.5%) and stepped mental health care and waitlist control (n = 1, 4.5%). Other studies used self-guided approaches (n = 1, 4.5%) or self-guided and guided approaches (n = 3, 13.6%).
  • Overall, there were slightly more qualitative studies (n = 11, 50%) than quantitative studies (n = 8, 36.4%) including 4 RCTs, in addition to a few mixed-methods studies (n = 3, 13.6%).
  • Feasibility (n = 6, 27.3%) was the most common study type, in addition to various combinations, i.e., feasibility and acceptability (n = 3, 13.6%); feasibility, acceptability and engagement (n = 2, 9.1%); feasibility, usability and engagement (n = 1, 4.5%); and feasibility, safety and acceptability (n = 1, 4.5%). The remainder of the study types included usability and engagement (n = 4, 18.2%); effectiveness (n = 2, 9.1%); effectiveness and usability (n = 1, 4.5%); acceptability (n = 1, 4.5%); and acceptability and engagement (n = 1, 4.5%).

3.2.2. Main Findings of the Included Studies

A review of the empirical literature found a small but promising amount of evidence for the use of DMH platforms and DMHIs in mental health care and suicide prevention. Overall, significant evidence was found in 2 of the 22 (9.1%) included studies. There was mostly preliminary evidence, marked by 19 of the 22 (86.4%). No evidence was found in 1 of the 22 (4.5%). The 11 empirical studies that focused on evaluation of the DMH platform/s used comprised 1 study (9.1%) that found significant evidence, 9 studies (81.8%) that found preliminary evidence and 1 study (9.1%) that found no evidence. The 11 empirical studies that focused on evaluation of DMHI/s applied on the DMH platform/s comprised 1 study (9.1%) that found significant evidence and 10 studies (90.9%) that found preliminary evidence.
Empirical studies that focused on evaluation of the DMH platform/s used
One study provided significant evidence:
  • One quantitative study on feasibility and acceptability found efficacy for the affirmative CBT-based AFFIRM Online, which used blended care to relieve depression and improve coping with stress in the LGBTQA+ youth community [61].
Nine studies contributed preliminary evidence:
  • One RCT on feasibility, acceptability and safety found Horyzons had no significant effect on social functioning compared with treatment as usual [58], although there was a significant correlation between use of the DMH platform and perceived helpfulness for vocational and relapse prevention support.
  • One quantitative study on feasibility and acceptability found statistically significant support for 7Cups in treating postpartum depression [59]. However, there was no significant difference compared to treatment as usual.
  • One qualitative study on feasibility found Happify Health’s loneliness interventions may be effective in self-guided and guided approaches [60].
  • One RCT on feasibility found possible efficacy for the Swedish health care system DMH platform that applies ICBT for treating depression in routine psychiatric care [62], although the findings are limited by the small sample size.
  • One qualitative study on feasibility found stakeholders supported the use of the Innowell DMH platform [63], although effective implementation is hindered by human factors.
  • One quantitative study on effectiveness found BetterHelp to be potentially effective for treating adult depression [64], although it was noted that trials are needed.
  • One mixed-methods study on feasibility, acceptability and engagement found initial support for Smooth Sailing [65], although effective engagement strategies are needed.
  • One qualitative study on usability and engagement found longitudinal studies are required to confirm Depression Connect is effective for sharing coping experience [66].
  • One mixed-methods study on feasibility found DMH platforms can assist evaluating youth wellbeing [68]. However, more effective qualitative strategies are required.
One study demonstrated no findings of evidence:
  • One qualitative study on acceptability and engagement found a lack of support for Virtual Coach because it was difficult to relate to and engage with [67].
Empirical studies that focused on evaluation of DMHI/s applied on the DMH platform/s
One study provided significant evidence:
  • One RCT on the effectiveness of the SilverCloud DMH platform’s ICBT in a stepped care approach reported (cost-)effectiveness with significant long-term impact on anxiety and depression in UK general population adults [76].
Ten studies contributed preliminary evidence:
  • One qualitative study on feasibility and acceptability found largely positive views on DMHIs for health care delivery [69]. However, concerns over privacy and data were noted.
  • One qualitative study on feasibility, usability and engagement reported user engagement and delivery of ICBT for depression could be improved by establishing, planning and promoting a working alliance in the user-practitioner relationship [70].
  • One qualitative study on usability and engagement found tele-mental health on DMH platforms may offer a range of important interpersonal interactions that present benefits [71], although there are hindering ethical complexities and structural challenges.
  • One qualitative study on feasibility found SMART Recovery could assist mutual support through meetings online [72]. However, these methods are not as well-suited to those with experience of in-person support.
  • One RCT on usability and engagement found an optimized UI based on UX contributed to increased usability and engagement in treatment with the Swedish health care system DMH platform [73], although the relationship between UI and treatment effectiveness was unclear.
  • One qualitative study on feasibility found clinicians use digital tools with utility [74], although a centralized DMH platform is required to improve stakeholder accessibility, in addition to youth-oriented tailored solutions.
  • One mixed-methods study on feasibility, acceptability and engagement found consensus on the stakeholder benefits from DMHIs that use technology-enabled care coordination (TECC) [75]. However, implementation of the DMHIs is hindered by human factors.
  • One qualitative study on usability and engagement found appropriate language and presentation styles in a social media campaign and online support forum [77]. However, datasets are required to improve mental health communication.
  • One quantitative study on effectiveness, usability and engagement found a high level of engagement, a very high level of satisfaction and sustained overall improvement in psychological symptoms [78], although the relatively small size of the registered sample prevented generalizability.
  • One qualitative study on acceptability found young people supported blended mental health care in an assistive capacity to traditional care although evaluative evidence is needed to determine the impact on the therapeutic alliance, clinical and social outcomes, cost-effectiveness, and engagement [79].
The most described services were mental health screening, online guided and online self-guided therapy, tele-mental health, and integrated approaches. There were more studies on adults (68.2% compared to 31.8% for youth), although targeted strategies were more common for youth (62.5% compared to 37.5% for adults). Only a few studies focused on subpopulations, of which youth with psychosis was the most studied. However, efficacy was found for AFFIRM Online, which demonstrated a successful example of a community-based DMH platform for LGBTQA+ youth with stress and depression. Overall, the RCT of SilverCloud’s ICBT program provided the most significant evidence to date: Richards et al. noted the (cost-)effectiveness of a DMHI with significant long-term impact on anxiety and depression in the UK general adult population [76].

4. Discussion

4.1. Principal Findings of Empirical Literature

Slightly more qualitative than quantitative studies were found, with the remainder being mixed-methods studies. Overall, the studies mainly evaluated feasibility, usability, engagement, acceptability and effectiveness. Although feasibility was established for the use of DMH platforms and DMHIs in mental health care and suicide prevention, the results highlight the need to increase usability and engagement in addition to effectiveness and quality.
The main types of DMH platforms used in the 22 included empirical studies are categorized as integrated, guided, self-guided, integrated-multifunctional, multimodal, and direct to consumer tele-mental health. This contrasted with previous reviews which mostly reported off-the-shelf solutions through computers, mobile apps, text message, telephone, web, CD-ROM, and video for general population DMHIs for suicidal ideation and mental health co-morbidities [20]. Other previous reviews focused on general mental health support [10], in addition to self-guided digital tools for anxiety and depression in general populations [31,32]. In line with the previous reports of variability in the applications of use, the empirical evidence suggests DMH platforms and DMHIs are used for a range of purposes, e.g., to treat loneliness and to aid suicide prevention.
The high number and frequent use of DMH tools [9] is reflected in the evaluative evidence base on the use of DMH platforms and DMHIs. In line with the previous findings of Borghouts et al. [10], there was heterogeneity found in the mostly preliminary evidence. These findings mainly focused on feasibility, usability, engagement, and acceptability rather than the effectiveness of each DMH platform or DMHI.
The most significant finding overall arose from the RCT of SilverCloud’s ICBT for anxiety and depression [76]. This RCT followed a study that established the efficacy of ICBT for adults with depressive symptoms [45]. The general lack of study follow-up in the domain has hindered evaluation, considering there are more than 100 DMH programs for depressed and anxious adults [11,12]. RCTs are considered the “gold standard” by which psychological interventions are evaluated and subsequently adopted into general clinical practice [80]. However, there are some limitations of RCTs in developing treatment guidelines in terms of the pragmatic application from a sample to the individual patient. For example, the baseline characteristics of the RCT by Richards et al. [76] reported that 70% of the sample were female, only slightly higher than the program referral rate for females (65%). This incidental finding highlights the inherent difficulties in recruiting and engaging men in mental health research [48]. This limitation extends to identifying the underserved and the unserved in mental health care assessment and treatment [19].
The significant and preliminary evidence categories presented in the Results section do not tell the whole story regarding efficacy and effectiveness. For example, a previous review reported the DMH platform MOST is safe and effective [44]; however, it is not clearly stated what it is effective for. It appears from the qualitative study by Valentine et al. [79] that young people supported blended care through Horyzons (a derivative of MOST), although further evaluative research is needed on efficacy, e.g., on the therapeutic alliance, clinical and social outcomes, cost-effectiveness, and engagement. There was also no significant effect on social functioning compared with treatment as usual as a primary outcome of the RCT with Horyzons [58]. This RCT followed extensive design, implementation [42], and augmentation of social connectedness and empowerment in youth first-episode psychosis [43]. These examples highlight the need for the current study, which distinguished between evaluative research focused on the effectiveness of the DMH platform and the effectiveness of the DMHI applied on the DMH platform.
The previous body of knowledge noted the difference between rigorous evidence of efficacy in trials and outcomes that indicate a lack of real-world impact [35]. The current study supports this finding, although clarifying the distinction between efficacy and effectiveness may also generally assist in the evaluation of DMH platforms and DMHIs. For example, Craig et al. [61] evaluated the AFFIRM Online DMH platform and reported that it demonstrated efficacy by working under ideal circumstances. However, the RCT of SilverCloud’s ICBT for anxiety and depression [76] was deemed to provide more significant evidence because it applied a waiting list control to demonstrate pragmatic effectiveness under less-than-ideal, real-world circumstances. Evaluation of DMHIs may produce relevant, measurable, responsive, and resourced indications of safety or effectiveness for their intended mental health care and/or suicide prevention purpose. RCTs can bolster these claims by providing randomization, which decreases bias and offers a rigorous tool to examine cause-effect relationships between an intervention and an outcome. However, a successful RCT may not be required to demonstrate safety and effectiveness.

4.2. Secondary Findings of Empirical Literature

Robust stakeholder engagement is required to ensure there is responsiveness to needs and to gain support for DMH implementation. Previous work noted the existence of targeted strategies to serve young people in mental health care [13,42]. Although the evidence synthesis found more of a focus on adults, there was a slightly higher number of targeted strategies for young people. However, there is a need for more effective qualitative strategies, such as designing and implementing youth-oriented tailored solutions [68] and implementing a centralized DMH platform to improve stakeholder accessibility [74]. The previous review of Spadaro et al. [51] suggested overhauling the application of codesign, behavior theories, and clinical evaluation. In line with this, a qualitative study that evaluated DMHIs on the Innowell DMH platform articulated some implementation problems: restricted access, siloed services, interventions that are poorly matched to service users’ needs, underuse of personal outcome monitoring to track progress, exclusion of family and carers, and suboptimal experiences of care [75]. A subsequent evaluation of the Innowell DMH platform led to the finding that national scalability is hindered by human factors: the main problem is not the technology but the humans that implement and use it [63]. This is in line with previous findings about the constraints in instructing the recipients of technologies [5] and transforming clinicians’ strong interest in using technology into actual use [4].
A previous review found human-centered design is important for the codesign process to instill an understanding of how DMH platforms can be used with engaging effectiveness [47]. However, human-centered design is often not implemented well in DMH services. The evidence for HCI issues was consistent with this; for example, the relationship between the UI of a DMH platform and treatment effectiveness was unclear [73]. Furthermore, the results indicate that young people who perceived DMH platforms as useful in blended care were more willing to use the system in the future [69,79]. The results with the Innowell DMH platform suggested that codesign is not a foolproof method for increasing effectiveness with DMH platforms [63]. Previous findings on the need for key stakeholder and user input [3] were echoed, in addition to the call for funding and resources to expand regional case studies to the state level and beyond.

4.3. Future Research Implications and Prospects

International collaborations were proposed for the Australian DMH platform MOST+ to be adapted, translated and developed in a digital transdiagnostic clinical- and peer-moderated treatment trial with youth in the Netherlands [81]. Although designed to serve adult Australians, MindSpot is a clinically validated, vetted DMH platform that has provided free psychological screening and ICBT treatment for anxiety, depression, mental well-being, and general distress to more than 500,000 users [78]. Successful engagement strategies were noted as being required to increase the number of registered users and provide generalizability of the effectiveness of its ICBT program. However, there is a limit to the number of users that MindSpot can treat at a given time, so engagement strategies need to be tailored with this in mind. The most evidenced example of effectiveness is from an ICBT service for treating anxiety and depression in UK adults using the globally available, clinically validated SilverCloud DMH platform [76]. The organizational ability to increase registered engagement within the treatment capacity is an important issue the domain is grappling with.
Qualitative studies focused on increasing innovative engagement, usability and quality with adults suffering primarily from anxiety and depression may be a progressive next step to gather more evidence for the field. Then, it may be possible to translate findings and reevaluate (cost-)effective ways of targeting young people in mental health care and suicide prevention. It is already apparent from the preliminary evidence on serving young people that there are issues with DMH platform accessibility [74] and DMHI implementation [75]. Furthermore, there are issues in the need for parental consent/involvement, as well as higher uptake and engagement through frequent screening especially in adolescents [65].
The results were mostly derived from multidisciplinary databases—Scopus, ScienceDirect, and Sage. There is potential for future multidisciplinary research to focus on developing an understanding of what is technically required for an eminent DMH platform and how this can be applied with DMHIs. It can be surmised that an integrated-multifunctional DMH platform would be best used to demonstrate how to grow an ecosystem. For example, the LAMP DMH platform may be integrated with other systems and combines innovation, research and clinical interventions (e.g., assessment via surveys and sensors, digital phenotyping, self-management tools, data sharing with patients, and clinician support). LAMP is also linked to a consortium that provides education and collaboration between mental health practitioners and users to enable translational research [82].
The human factors problem noted in this review may benefit from a better understanding of interprofessional dynamics. An interprofessional approach appears necessary to promote mental health and well-being through various means including engagement, assessment, and intervention. It may help to investigate whether the domain is encountering barriers that include competition among the various professionals, thus hindering effective outcomes. One prospect is therefore to incorporate a model of expertise-based care into the domain, for example, by combining interprofessional values and ethics, common and encompassing respect, and privacy and confidentiality in service delivery.

5. Strengths and Limitations

This study is strengthened by a systematic approach applied to two different operationalizations of evaluating DMH platforms, i.e., the use of DMH platforms as well as how they are applied in DMHIs. The body of knowledge was synthesized to point towards the aims of the review. Next, the results were tabled and clearly presented. Thereafter, the discussion compared the previous body of knowledge with the results, integrating the complexities and challenges of evaluating DMH platforms and DMHIs as well as opportunities for increasing the evaluative impact of the domain. Furthermore, this review strengthens the knowledge base by clearly pointing to which types of DMH platforms and DMHIs as well as study designs may best help advance the domain.
Although a thorough effort was made to confirm the rigor of the search strategy, potentially appropriate studies may not have been identified if the authors of a journal article did not use the search keywords included in this review. For example, we may have missed articles that used alternate forms of “digital platform” AND “mental health care” OR “suicide prevention”, such as “operating system” AND “psychological interventions” OR “crisis support”. It may help future systematic reviews to be consistent with the search terminology, although a wider range of alternate search terminology may be necessary as the domain advances. In addition, limiting the search to journal articles published in the English language may also have excluded relevant studies in other languages.
A potential limitation is that we did not include studies focused on standalone mobile apps, AI-driven interactive/immersive/wearable technologies, social media and digital phenotyping. Bell et al. [4] included all these DMH technologies in their survey of young people’s and clinicians’ access to, use of and interest in various technologies and their applications. However, we focused on the evaluation of DMH platforms or DMHIs applied on DMH platforms. The rationale is based on the assessment of the body of knowledge that digital platforms are the most used technologies [17]. Although there is yet to be global eminence in the use of DMH platforms [46], it is established that there is a very large number of apps lacking in clinical testing [55,56]. The other excluded technologies are interesting; however, a line needed to be drawn regarding efficacy, safety and quality. The DMH platform types that we focused on (i.e., tele-mental health, self-guided and/or guided therapy, as well as multifunctional and/or integrated) were also described by Bell et al. [4], although some of the terminologies were slightly different.
The review is compliant with PRISMA-ScR and included a quality appraisal, although there was no critical appraisal of individual sources of evidence because this was not in the research aim. It may be appropriate for future research to include PRISMA-ScR Item 12 ‘critical appraisal of individual sources of evidence’ including a rationale, a description of the methods used and how this information was used in any data synthesis.

6. Conclusions

This scoping review evaluated the varied use of DMH platforms and DMHIs for managing and assisting mental health care and suicide prevention. Although there is a need to decrease heterogeneity, and increase the number of significant findings, the review highlighted the promise of several usable, quality DMH platforms. A scalable DMH platform applying ICBT for treating adults’ anxiety and depression is currently the most reliable example of effectiveness. A notable challenge is implementing targeted strategies such as engaging independent young people.

Author Contributions

Conceptualization, L.B.; investigation (i.e., screening, data extraction and full-text review with the inclusion/exclusion criteria), L.B.; quality appraisal and consultation, L.B. and D.D.L.; writing—original draft preparation, L.B.; writing—review and editing, L.B. and D.D.L.; supervision, D.D.L.; project administration, L.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare that they have no conflict of interest.

Abbreviations

ACM: Association for Computing Machinery
AI: artificial intelligence
CBT: cognitive behavior therapy
CD-ROM: Compact Disc Read-Only Memory
DMH: digital mental health
DTCTMH: direct to consumer tele-mental health
DMHI/s: digital mental health intervention/s
HCI: human–computer interaction
IAPT: Improving Access to Psychological Therapies
ICBT: Internet-delivered cognitive behavioral therapy
LAMP: Learn, Assess, Manage, Prevent
LGBTQA+: lesbian, gay, bisexual, transgender, queer/questioning, asexual
MOST: Moderated Online Social Therapy
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses—extension for Scoping Reviews
RCT/s: randomized controlled trial/s
SPARX: Smart, Positive, Active, Realistic, X-Factor Thoughts
TECC: technology-enabled care coordination
TEN: The Essential Network
UI: user interface
UK: United Kingdom
USA: United States of America
USD: United States dollars
UX: user experience
WHO: World Health Organization

Appendix A

Appendix A.1. Definitions

Digital platforms are technology programs delivered online through web-based and mobile applications or both (i.e., cross-platforms) [9]. Digital platforms may provide multimodal functions including email, phone, internet-based video conferencing or live chat [64].
DMH platforms include a virtual means to deliver mental health support, to establish trust, and to build a therapeutic alliance with users over time [83].
Types of DMH platforms: online self-guided therapy, online guided therapy, tele-mental health, multifunctional and/or integrated [47].
Online self-guided therapy is designed to be used online without professional guidance [84].
Online guided therapy includes some form of human support to increase engagement and/or outcomes as well as online tools deployed as adjuncts to traditional treatments to increase their effectiveness and efficiency [16].
Tele-mental health platforms provide real-time mental health care through a phone call or video conferencing when patients and practitioners are at a distance [85].
Multifunctional DMH platforms enable the user to perform several tasks simultaneously in an online service location [86]. For example, there is versatility in individual and/or practitioner use for innovation, research, treatment, self-help, care coordination, learning, collaboration and/or information distribution.
Integrated DMH platforms provide clinical integration of online services with a separate network of face-to-face mental health services [42].
A blended model of care combines DMHIs (online treatments) with traditional mental health care (routine therapies) [87].
A hybrid model of care includes traditional and technology-based modes whereby the patient and practitioner choose either in-person or virtual sessions or both [88].
A stepped model of care aims to provide and monitor the most effective yet resource-conservative treatment [89]. The treatment intensity varies according to the individual’s treatment need and the severity of their mental health condition [90].
Mental health care is the service and delivery of psychological screening and testing, psychotherapy and family therapy, and neuropsychological rehabilitation provided by several fields involved in psychological assessment and intervention (e.g., psychology, psychiatry, neurology, social work) [91].
Suicide prevention is a set of efforts to counteract the risk of suicide including prevention and protective strategies for individuals, families, and communities such as information resources and training of staff [92].
Targeted strategies are approaches, interventions, plans, schemes or campaigns aimed at a population to increase engagement in mental health care including in the DMH domain [93]. For example, policy and theory development and/or service model reform to address the poor indicators of mental health outcomes for adolescent boys and young adult men.

Appendix A.2. Inclusion and Exclusion Criteria

Inclusion criteria
  • Full paper (journal article) written in English.
  • Empirical studies that described the use of DMH platforms (i.e., tele-mental health, online self-guided and/or online guided therapy, as well as multifunctional and/or integrated) through computers and/or smartphones.
  • Empirical studies that considered aspects of assessing mental health care and/or suicide prevention matters.
  • Details on the aim, DMH platform type, purpose of use and population as well as outcomes/form of evidence.
Exclusion criteria
  • Non-research articles (e.g., conference proceedings, magazines, guest editorial letters, forewords, keynotes, book reviews, posters, and workshop findings).
  • Empirical studies that did not use a digital platform as well as a mental health care and/or suicide prevention component.
  • Empirical studies focused on DMH platforms with the following components: standalone mobile app, AI-driven immersive/interactive/wearable technologies, social media and digital phenotyping.
  • Empirical studies without a report of aim and outcomes/form of evidence.

Appendix B

Table A1. PRISMA-ScR Checklist.
Section | Item | PRISMA-ScR Checklist Item
Title | 1 | Identify the report as a scoping review.
Structured summary | 2 | Provide a structured summary that includes (as applicable): background, objectives, eligibility criteria, sources of evidence, charting methods, results, and conclusions that relate to the review questions and objectives.
Rationale | 3 | Describe the rationale for the review in the context of what is already known. Explain why the review questions/objectives lend themselves to a scoping review approach.
Objectives | 4 | Provide an explicit statement of the questions and objectives being addressed with reference to their key elements (e.g., population or participants, concepts, and context) or other relevant key elements used to conceptualize the review questions and/or objectives.
Protocol and registration | 5 | Indicate whether a review protocol exists; state if and where it can be accessed (e.g., a Web address); and if available, provide registration information, including the registration number.
Eligibility criteria | 6 | Specify characteristics of the sources of evidence used as eligibility criteria (e.g., years considered, language, and publication status), and provide a rationale.
Information sources * | 7 | Describe all information sources in the search (e.g., databases with dates of coverage and contact with authors to identify additional sources), as well as the date the most recent search was executed.
Search | 8 | Present the full electronic search strategy for at least 1 database, including any limits used, such that it could be repeated.
Selection of sources of evidence † | 9 | State the process for selecting sources of evidence (i.e., screening and eligibility) included in the scoping review.
Data charting process | 10 | Describe the methods of charting data from the included sources of evidence (e.g., calibrated forms or forms that have been tested by the team before their use, and whether data charting was done independently or in duplicate) and any processes for obtaining and confirming data from investigators.
Data items | 11 | List and define all variables for which data were sought and any assumptions and simplifications made.
Critical appraisal of individual sources of evidence § | 12 | If done, provide a rationale for conducting a critical appraisal of included sources of evidence; describe the methods used and how this information was used in any data synthesis (if appropriate).
Synthesis of results | 13 | Describe the methods of handling and summarizing the data that were charted.
Selection of sources of evidence | 14 | Give numbers of sources of evidence screened, assessed for eligibility, and included in the review, with reasons for exclusions at each stage, ideally using a flow diagram.
Characteristics of sources of evidence | 15 | For each source of evidence, present characteristics for which data were charted and provide the citations.
Critical appraisal within sources of evidence | 16 | If done, present data on critical appraisal of included sources of evidence (see item 12).
Results of individual sources of evidence | 17 | For each included source of evidence, present the relevant data that were charted that relate to the review questions and objectives.
Synthesis of results | 18 | Summarize and/or present the charting results as they relate to the review questions and objectives.
Summary of evidence | 19 | Summarize the main results (including an overview of concepts, themes, and types of evidence available), link to the review questions and objectives, and consider the relevance to key groups.
Limitations | 20 | Discuss the limitations of the scoping review process.
Conclusions | 21 | Provide a general interpretation of the results with respect to the review questions and objectives, as well as potential implications and/or next steps.
Funding | 22 | Describe sources of funding for the included sources of evidence, as well as sources of funding for the scoping review. Describe the role of the funders of the scoping review.
* Where sources of evidence (see second footnote) are compiled from, such as bibliographic databases, social media platforms, and Web sites.
† A more inclusive/heterogeneous term used to account for the different types of evidence or data sources (e.g., quantitative and/or qualitative research, expert opinion, and policy documents) that may be eligible in a scoping review as opposed to only studies. This is not to be confused with information sources (see first footnote).
§ The process of systematically examining research evidence to assess its validity, results, and relevance before using it to inform a decision. This term is used for items 12 and 16 instead of "risk of bias" (which is more applicable to systematic reviews of interventions) to include and acknowledge the various sources of evidence that may be used in a scoping review (e.g., quantitative and/or qualitative research, expert opinion, and policy documents).
Source: [54].

References

  1. World Health Organization (WHO). Suicide Worldwide in 2019, Global Health Estimates. Geneva, World Health Organization. 2021. Available online: https://www.who.int/publications/i/item/9789240026643 (accessed on 22 February 2022).
  2. Gratzer, D.; Torous, J.; Lam, R.W.; Patten, S.B.; Kutcher, S.; Chan, S.; Yatham, L.N. Our Digital Moment: Innovations and Opportunities in Digital Mental Health Care. Can. J. Psychiatry 2020, 66, 5–8. [Google Scholar] [CrossRef] [PubMed]
  3. Torous, J.; Jän Myrick, K.; Rauseo-Ricupero, N.; Firth, J. Digital Mental Health and COVID-19: Using Technology Today to Accelerate the Curve on Access and Quality Tomorrow. JMIR Ment. Health 2020, 7, e18848. [Google Scholar] [CrossRef] [PubMed]
  4. Bell, I.H.; Thompson, A.; Valentine, L.; Adams, S.; Alvarez-Jimenez, M.; Nicholas, J. Ownership, Use of, and Interest in Digital Mental Health Technologies Among Clinicians and Young People Across a Spectrum of Clinical Care Needs: Cross-sectional Survey. JMIR Ment. Health 2022, 9, e30716. [Google Scholar] [CrossRef] [PubMed]
  5. Torous, J.; Bucci, S.; Bell, I.H.; Kessing, L.V.; Faurholt-Jepsen, M.; Whelan, P.; Firth, J. The growing field of digital psychiatry: Current evidence and the future of apps, social media, chatbots, and virtual reality. World Psychiatry 2021, 20, 318–335. [Google Scholar] [CrossRef]
  6. World Health Organization. WHO Guideline: Recommendations on Digital Interventions for Health System Strengthening. Executive Summary. Geneva: World Health Organization. 2019. Available online: http://apps.who.int/iris/bitstream/handle/10665/311941/9789241550505-eng.pdf (accessed on 2 March 2022).
  7. Wies, B.; Landers, C.; Ienca, M. Digital Mental Health for Young People: A Scoping Review of Promises and Challenges. Front. Digit. Health 2021, 3, 697072. [Google Scholar] [CrossRef] [PubMed]
  8. Research and Markets. Global Emerging Mental Health Devices and Platforms Market: Analysis and Forecast, 2021–2030. 2021. Available online: https://www.researchandmarkets.com/reports/5315021/global-emerging-mental-health-devices-and (accessed on 2 March 2022).
  9. De Witte, N.A.J.; Joris, S.; Van Assche, E.; Van Daele, T. Technological and Digital Interventions for Mental Health and Wellbeing: An Overview of Systematic Reviews. Front. Digit. Health 2021, 3, 754337. [Google Scholar] [CrossRef]
  10. Borghouts, J.; Eikey, E.; Mark, G.; De Leon, C.; Schueller, S.M.; Schneider, M.; Stadnick, N.; Zheng, K.; Mukamel, D.; Sorkin, D.H. Barriers to and Facilitators of User Engagement with Digital Mental Health Interventions: Systematic Review. J. Med. Internet Res. 2021, 23, e24387. [Google Scholar] [CrossRef]
  11. Andersson, G. Internet-Delivered Psychological Treatments. Annu. Rev. Clin. Psychol. 2016, 12, 157–179. [Google Scholar] [CrossRef]
  12. Scholten, H.; Granic, I. Use of the Principles of Design Thinking to Address Limitations of Digital Mental Health Interventions for Youth: Viewpoint. J. Med. Internet Res. 2019, 21, e11528. [Google Scholar] [CrossRef]
  13. Iorfino, F.; Cross, S.P.; Davenport, T.; Carpenter, J.S.; Scott, E.; Shiran, S.; Hickie, I.B. A Digital Platform Designed for Youth Mental Health Services to Deliver Personalized and Measurement-Based Care. Front. Psychiatry 2019, 10, 595. [Google Scholar] [CrossRef] [PubMed]
  14. Balcombe, L.; De Leo, D. Digital Mental Health Amid COVID-19. Encyclopedia 2021, 1, 1047–1057. [Google Scholar] [CrossRef]
  15. Titov, N.; Dear, B.F.; Staples, L.G.; Bennett-Levy, J.; Klein, B.; Rapee, R.M.; Shann, C.; Richards, D.; Nielssen, O.B. MindSpot Clinic: An Accessible, Efficient, and Effective Online Treatment Service for Anxiety and Depression. Psychiatr. Serv. 2015, 66, 1043–1050. [Google Scholar] [CrossRef] [PubMed]
  16. Schueller, S.M.; Torous, J. Scaling evidence-based treatments through digital mental health. Am. Psychol. 2020, 75, 1093–1104. [Google Scholar] [CrossRef] [PubMed]
  17. World Economic Forum. Global Governance Toolkit for Digital Mental Health: Building Trust in Disruptive Technology for Mental Health. 2021. Available online: https://www3.weforum.org/docs/WEF_Global_Governance_Toolkit_for_Digital_Mental_Health_2021.pdf (accessed on 15 March 2022).
  18. Braciszewski, J.M. Digital Technology for Suicide Prevention. Adv. Psychiatry Behav. Health 2021, 1, 53–65. [Google Scholar] [CrossRef]
  19. Balcombe, L.; De Leo, D. Digital Mental Health Challenges and the Horizon Ahead for Solutions. JMIR Ment. Health 2021, 8, e26811. [Google Scholar] [CrossRef]
  20. Kreuze, E.; Jenkins, C.; Gregoski, M.; York, J.; Mueller, M.; Lamis, D.A.; Ruggiero, K.J. Technology-enhanced suicide prevention interventions: A systematic review. J. Telemed. Telecare 2016, 23, 605–617. [Google Scholar] [CrossRef]
  21. Bachmann, S. Epidemiology of Suicide and the Psychiatric Perspective. Int. J. Environ. Res. Public Health 2018, 15, 1425. [Google Scholar] [CrossRef] [Green Version]
  22. Fowler, J.C.; Madan, A.; Bruce, C.R.; Frueh, B.C.; Kash, B.; Jones, S.L.; Sasangohar, F. Improving Psychiatric Care Through Integrated Digital Technologies. J. Psychiatr. Pract. 2021, 27, 92–100. [Google Scholar] [CrossRef]
  23. Firth, J.; Torous, J.; Nicholas, J.; Carney, R.; Rosenbaum, S.; Sarris, J. Can smartphone mental health interventions reduce symptoms of anxiety? A meta-analysis of randomized controlled trials. J. Affect. Disord. 2017, 218, 15–22. [Google Scholar] [CrossRef]
  24. Romijn, G.; Batelaan, N.; Kok, R.; Koning, J.; van Balkom, A.; Titov, N.; Riper, H. Internet-Delivered Cognitive Behavioral Therapy for Anxiety Disorders in Open Community Versus Clinical Service Recruitment: Meta-Analysis. J. Med. Internet Res. 2019, 21, e11706. [Google Scholar] [CrossRef]
  25. Firth, J.; Torous, J.; Nicholas, J.; Carney, R.; Pratap, A.; Rosenbaum, S.; Sarris, J. The efficacy of smartphone-based mental health interventions for depressive symptoms: A meta-analysis of randomized controlled trials. World Psychiatry 2017, 16, 287–298. [Google Scholar] [CrossRef] [PubMed]
  26. Bidargaddi, N.; Schrader, G.; Klasnja, P.; Licinio, J.; Murphy, S. Designing m-Health interventions for precision mental health support. Transl. Psychiatry 2020, 10, 1–8. [Google Scholar] [CrossRef] [PubMed]
  27. Bergin, A.D.; Vallejos, E.P.; Davies, E.B.; Daley, D.; Ford, T.; Harold, G.; Hollis, C. Preventive digital mental health interventions for children and young people: A review of the design and reporting of research. NPJ Digit. Med. 2020, 3, 133. [Google Scholar] [CrossRef]
  28. Davenport, T.A.; Cheng, V.W.S.; Iorfino, F.; Hamilton, B.; Castaldi, E.; Burton, A.; Hickie, I.B. Flip the Clinic: A Digital Health Approach to Youth Mental Health Service Delivery During the COVID-19 Pandemic and Beyond. JMIR Ment. Health 2020, 7, e24578. [Google Scholar] [CrossRef]
  29. Lattie, E.G.; Adkins, E.C.; Winquist, N.; Stiles-Shields, C.; Wafford, Q.E.; Graham, A.K. Digital Mental Health Interventions for Depression, Anxiety, and Enhancement of Psychological Well-Being Among College Students: Systematic Review. J. Med. Internet Res. 2019, 21, e12869. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  30. Lehtimaki, S.; Martic, J.; Wahl, B.; Foster, K.T.; Schwalbe, N. Evidence on Digital Mental Health Interventions for Adolescents and Young People: Systematic Overview. JMIR Ment. Health 2021, 8, e25847. [Google Scholar] [CrossRef] [PubMed]
  31. Pauley, D.; Cuijpers, P.; Papola, D.; Miguel, C.; Karyotaki, E. Two decades of digital interventions for anxiety disorders: A systematic review and meta-analysis of treatment effectiveness. Psychol. Med. 2021, 1–13. [Google Scholar] [CrossRef]
  32. Moshe, I.; Terhorst, Y.; Philippi, P.; Domhardt, M.; Cuijpers, P.; Cristea, I.; Sander, L.B. Digital interventions for the treatment of depression: A meta-analytic review. Psychol. Bull. 2021, 147, 749–786. [Google Scholar] [CrossRef]
  33. Webb, C.A.; Rosso, I.M.; Rauch, S.L. Internet-based cognitive-behavioral therapy for depression: Current progress and future directions. Harv. Rev. Psychiatry 2017, 25, 114–122. [Google Scholar] [CrossRef] [Green Version]
  34. Nebeker, C.; Bartlett Ellis, R.J.; Torous, J. Development of a decision-making checklist tool to support technology selection in digital health research. Transl. Behav. Med. 2020, 10, 1004–1015. [Google Scholar] [CrossRef]
  35. Roland, J.; Lawrance, E.; Insel, T.; Christensen, H. The Digital Mental Health Revolution: Transforming Care Through Innovation and Scale-Up. 2020. Available online: https://www.wish.org.qa/reports/the-digital-mental-health-revolution-transforming-care-through-innovation-and-scale-up/ (accessed on 22 February 2022).
  36. Baldwin, P.A.; Black, M.J.; Newby, J.M.; Brown, L.; Scott, N.; Shrestha, T.; Christensen, H. The Essential Network (TEN): Rapid development and implementation of a digital-first mental health solution for Australian healthcare workers during COVID-19. BMJ Innov. 2020, 8, 105–110. [Google Scholar] [CrossRef]
  37. Maron, E.; Baldwin, D.S.; Balõtšev, R.; Fabbri, C.; Gaur, V.; Hidalgo-Mazzei, D.; Eberhard, J. Manifesto for an international digital mental health network. Digit. Psychiatry 2019, 2, 14–24. [Google Scholar] [CrossRef] [Green Version]
  38. Himle, J.A.; Weaver, A.; Zhang, A.; Xiang, X. Digital Mental Health Interventions for Depression. Cogn. Behav. Pract. 2022, 29, 50–59. [Google Scholar] [CrossRef]
  39. Teachman, B.A.; Silverman, A.L.; Werntz, A. Digital Mental Health Services: Moving from Promise to Results. Cogn. Behav. Pract. 2022, 29, 97–104. [Google Scholar] [CrossRef]
  40. Torous, J.; Nicholas, J.; Larsen, M.E.; Firth, J.; Christensen, H. Clinical review of user engagement with mental health smartphone apps: Evidence, theory and improvements. Evid. Based Ment. Health 2018, 21, 116–119. [Google Scholar] [CrossRef]
  41. Eyre, H.A.; Berk, M.; Lavretsky, H.; Reynolds, C. (Eds.) Convergence Mental Health: A Transdisciplinary Approach to Innovation; Oxford University Press: Oxford, UK, 2021. [Google Scholar] [CrossRef]
  42. Alvarez-Jimenez, M.; Rice, S.; D’Alfonso, S.; Leicester, S.; Bendall, S.; Pryor, I.; Gleeson, J. A Novel Multimodal Digital Service (Moderated Online Social Therapy+) for Help-Seeking Young People Experiencing Mental Ill-Health: Pilot Evaluation Within a National Youth E-Mental Health Service. J. Med. Internet Res. 2020, 22, e17155. [Google Scholar] [CrossRef] [PubMed]
  43. Alvarez-Jimenez, M.; Bendall, S.; Lederman, R.; Wadley, G.; Chinnery, G.; Vargas, S.; Gleeson, J.F. On the HORYZON: Moderated online social therapy for long-term recovery in first episode psychosis. Schizophr. Res. 2013, 143, 143–149. [Google Scholar] [CrossRef]
  44. McGorry, P.D.; Mei, C.; Chanen, A.; Hodges, C.; Alvarez-Jimenez, M.; Killackey, E. Designing and scaling up integrated youth mental health care. World Psychiatry 2022, 21, 61–76. [Google Scholar] [CrossRef]
  45. Richards, D.; Timulak, L.; O’Brien, E.; Hayes, C.; Vigano, N.; Sharry, J.; Doherty, G. A randomized controlled trial of an internet-delivered treatment: Its potential as a low-intensity community intervention for adults with symptoms of depression. Behav. Res. Ther. 2015, 75, 20–31. [Google Scholar] [CrossRef] [PubMed]
  46. Balcombe, L.; De Leo, D. An Integrated Blueprint for Digital Mental Health Services Amidst COVID-19. JMIR Ment. Health 2020, 7, e21718. [Google Scholar] [CrossRef]
  47. Balcombe, L.; De Leo, D. Human-Computer Interaction in Digital Mental Health. Informatics 2022, 9, 14. [Google Scholar] [CrossRef]
  48. Balcombe, L.; De Leo, D. The Potential Impact of Adjunct Digital Tools and Technology to Help Distressed and Suicidal Men: An Integrative Review. Front. Psychol. 2022, 12, 796371. [Google Scholar] [CrossRef] [PubMed]
  49. Muñoz, R.F.; Chavira, D.A.; Himle, J.A.; Koerner, K.; Muroff, J.; Reynolds, J.; Schueller, S.M. Digital apothecaries: A vision for making health care interventions accessible worldwide. mHealth 2018, 4, 18. [Google Scholar] [CrossRef] [PubMed]
  50. Ćosić, K.; Popović, S.; Šarlija, M.; Kesedžić, I.; Gambiraža, M.; Dropuljić, B.; Jovanovic, T. AI-Based Prediction and Prevention of Psychological and Behavioral Changes in Ex-COVID-19 Patients. Front. Psychol. 2021, 12, 782866. [Google Scholar] [CrossRef]
  51. Spadaro, B.; Martin-Key, N.A.; Bahn, S. Building the Digital Mental Health Ecosystem: Opportunities and Challenges for Mobile Health Innovators. J. Med. Internet Res. 2021, 23, e27507. [Google Scholar] [CrossRef]
  52. Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32. [Google Scholar] [CrossRef] [Green Version]
  53. Munn, Z.; Peters, M.D.J.; Stern, C.; Tufanaru, C.; McArthur, A.; Aromataris, E. Systematic review or scoping review? Guidance for authors when choosing between a systematic or scoping review approach. BMC Med. Res. Methodol. 2018, 18, 143. [Google Scholar] [CrossRef]
  54. Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Straus, S.E. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef] [Green Version]
  55. Torous, J.; Roberts, L.W. Needed innovation in digital health and smartphone applications for mental health: Transparency and trust. JAMA Psychiatry 2017, 74, 437–438. [Google Scholar] [CrossRef]
  56. Winkle, B.V.; Carpenter, N.; Moscucci, M. Why aren’t our digital solutions working for everyone? AMA J. Ethics 2017, 19, 1116–1124. [Google Scholar] [CrossRef] [Green Version]
  57. Denny, E.; Weckesser, A. Qualitative research: What it is and what it is not. BJOG Int. J. Obstet. Gy. 2019, 126, 369. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  58. Alvarez-Jimenez, M.; Koval, P.; Schmaal, L.; Bendall, S.; Gleeson, J. The Horyzons project: A randomized controlled trial of a novel online social therapy to maintain treatment effects from specialist first-episode psychosis services. World Psychiatry Off. J. World Psychiatr. Assoc. WPA 2021, 20, 233–243. [Google Scholar] [CrossRef] [PubMed]
  59. Baumel, A.; Tinkelman, A.; Mathur, N.; Kane, J.M. Digital Peer-Support Platform (7Cups) as an Adjunct Treatment for Women with Postpartum Depression: Feasibility, Acceptability, and Preliminary Efficacy Study. JMIR Mhealth Uhealth 2018, 6, e38. [Google Scholar] [CrossRef] [PubMed]
  60. Boucher, E.M.; McNaughton, E.C.; Harake, N.; Stafford, J.L.; Parks, A.C. The Impact of a Digital Intervention (Happify) on Loneliness During COVID-19: Qualitative Focus Group. JMIR Ment. Health 2021, 8, e26617. [Google Scholar] [CrossRef] [PubMed]
  61. Craig, S.L.; Leung, V.W.Y.; Pascoe, R.; Pang, N.; Iacono, G.; Austin, A.; Dillon, F. AFFIRM Online: Utilising an Affirmative Cognitive–Behavioural Digital Intervention to Improve Mental Health, Access, and Engagement among LGBTQA+ Youth and Young Adults. Int. J. Environ. Res. Public Health 2021, 18, 1541. [Google Scholar] [CrossRef] [PubMed]
  62. Johansson, O.; Bjärehed, J.; Andersson, G.; Carlbring, P.; Lundh, L.-G. Effectiveness of guided internet-delivered cognitive behavior therapy for depression in routine psychiatry: A randomized controlled trial. Internet Interv. 2019, 17, 100247. [Google Scholar] [CrossRef] [PubMed]
  63. LaMonica, H.M.; Iorfino, F.; Lee, G.Y.; Piper, S.; Occhipinti, J.-A.; Davenport, T.A.; Hickie, I.B. Informing the Future of Integrated Digital and Clinical Mental Health Care: Synthesis of the Outcomes from Project Synergy. JMIR Ment. Health 2022, 9, e33060. [Google Scholar] [CrossRef]
  64. Marcelle, E.T.; Nolting, L.; Hinshaw, S.P.; Aguilera, A. Effectiveness of a Multimodal Digital Psychotherapy Platform for Adult Depression: A Naturalistic Feasibility Study. JMIR Mhealth Uhealth 2019, 7, e10948. [Google Scholar] [CrossRef]
  65. O’Dea, B.; Subotic-Kerry, M.; King, C.; Mackinnon, A.J.; Achilles, M.R.; Anderson, M.; Christensen, H. A cluster randomised controlled trial of a web-based youth mental health service in Australian schools. Lancet Reg. Health—West. Pac. 2021, 12, 100178. [Google Scholar] [CrossRef] [PubMed]
  66. Smit, D.; Vrijsen, J.N.; Groeneweg, B.; Vellinga-Dings, A.; Peelen, J.; Spijker, J. A Newly Developed Online Peer Support Community for Depression (Depression Connect): Qualitative Study. J. Med. Internet Res. 2021, 23, e25917. [Google Scholar] [CrossRef]
  67. Venning, A.; Herd, M.C.; Oswald, T.K.; Razmi, S.; Glover, F.; Hawke, T.; Redpath, P. Exploring the acceptability of a digital mental health platform incorporating a virtual coach: The good, the bad, and the opportunities. Health Inform. J. 2021, 27, 146045822199487. [Google Scholar] [CrossRef] [PubMed]
  68. Vichta, R.; Gwinner, K.; Collyer, B. What would we use and how would we use it? Can digital technology be used to both enhance and evaluate well-being outcomes with highly vulnerable and disadvantaged young people? Eval. J. Australas. 2018, 18, 222–233. [Google Scholar] [CrossRef]
  69. Bucci, S.; Morris, R.; Berry, K.; Berry, N.; Haddock, G.; Barrowclough, C.; Edge, D. Early Psychosis Service User Views on Digital Technology: Qualitative Analysis. JMIR Ment. Health 2018, 5, e10091. [Google Scholar] [CrossRef] [PubMed]
  70. Doukani, A.; Free, C.; Michelson, D.; Araya, R.; Montero-Marin, J.; Smith, S.; Kakuma, R. Towards a conceptual framework of the working alliance in a blended low-intensity cognitive behavioural therapy intervention for depression in primary mental health care: A qualitative study. BMJ Open 2020, 10, e036299. [Google Scholar] [CrossRef] [PubMed]
  71. Goldkind, L.; Wolf, L. “That’s the Beauty of It”: Practitioners Describe the Affordances of Direct to Consumer Tele-Mental Health. Fam. Soc. J. Contemp. Soc. Serv. 2021, 102, 434–449. [Google Scholar] [CrossRef]
  72. Gray, R.M.; Kelly, P.J.; Beck, A.K.; Baker, A.L.; Deane, F.P.; Neale, J.; McGlaughlin, R. A qualitative exploration of SMART Recovery meetings in Australia and the role of a digital platform to support routine outcome monitoring. Addict. Behav. 2020, 101, 106144. [Google Scholar] [CrossRef]
  73. Hentati, A.; Forsell, E.; Ljótsson, B.; Kaldo, V.; Lindefors, N.; Kraepelien, M. The effect of user interface on treatment engagement in a self-guided digital problem-solving intervention: A randomized controlled trial. Internet Interv. 2021, 26, 100448. [Google Scholar] [CrossRef]
  74. Knapp, A.A.; Cohen, K.; Nicholas, J.; Mohr, D.C.; Carlo, A.D.; Skerl, J.J.; Lattie, E.G. Integration of Digital Tools Into Community Mental Health Care Settings That Serve Young People: Focus Group Study. JMIR Ment. Health 2021, 8, e27379. [Google Scholar] [CrossRef]
  75. LaMonica, H.M.; Milton, A.; Braunstein, K.; Rowe, S.C.; Ottavio, A.; Jackson, T.; Easton, M.A.; Hambleton, A.; Hickie, I.B.; Davenport, T.A. Technology-Enabled Solutions for Australian Mental Health Services Reform: Impact Evaluation. JMIR Form. Res. 2020, 4, e18759. [Google Scholar] [CrossRef]
  76. Richards, D.; Enrique, A.; Eilert, N.; Franklin, M.; Palacios, J.; Duffy, D.; Timulak, L. A pragmatic randomized waitlist-controlled effectiveness and cost-effectiveness trial of digital interventions for depression and anxiety. NPJ Digit. Med. 2020, 3, 85. [Google Scholar] [CrossRef]
  77. Sindoni, M.G. “#YouCanTalk”: A multimodal discourse analysis of suicide prevention and peer support in the Australian BeyondBlue platform. Discourse Commun. 2019, 14, 202–221. [Google Scholar] [CrossRef]
  78. Titov, N.; Dear, B.F.; Nielssen, O.; Wootton, B.; Kayrouz, R.; Karin, E.; Staples, L.G. User characteristics and outcomes from a national digital mental health service: An observational study of registrants of the Australian MindSpot Clinic. Lancet Digit. Health 2020, 2, e582–e593. [Google Scholar] [CrossRef] [PubMed]
  79. Valentine, L.; McEnery, C.; Bell, I.; O’Sullivan, S.; Pryor, I.; Gleeson, J.; Bendall, S.; Alvarez-Jimenez, M. Blended Digital and Face-to-Face Care for First-Episode Psychosis Treatment in Young People: Qualitative Study. JMIR Ment. Health 2020, 7, e18990. [Google Scholar] [CrossRef] [PubMed]
  80. Mulder, R.; Singh, A.B.; Hamilton, A.; Das, P.; Malhi, G.S. The limitations of using randomised controlled trials as a basis for developing treatment guidelines. Evid. Based Ment. Health 2018, 21, 4–6. [Google Scholar] [CrossRef]
  81. Van Doorn, M.; Popma, A.; van Amelsvoort, T.; McEnery, C.; Gleeson, J.F.; Ory, F.G.; Nieman, D.H. Engage Young people earlY (ENYOY): A mixed-method study design for a digital transdiagnostic clinical- and peer-moderated treatment platform for youth with beginning mental health complaints in the Netherlands. BMC Psychiatry 2021, 21, 368. [Google Scholar] [CrossRef]
  82. Bilden, R.; Torous, J. Global Collaboration Around Digital Mental Health: The LAMP Consortium. J. Technol. Behav. Sci. 2022, 7, 227–233. [Google Scholar] [CrossRef]
  83. Bickmore, T.W.; Mitchell, S.E.; Jack, B.W.; Paasche-Orlow, M.K.; Pfeifer, L.M.; O’Donnell, J. Response to a relational agent by hospital patients with depressive symptoms. Interact. Comput. 2010, 22, 289–298. [Google Scholar] [CrossRef] [Green Version]
  84. Torok, M.; Han, J.; Baker, S.; Werner-Seidler, A.; Wong, I.; Larsen, M.E.; Christensen, H. Suicide prevention using self-guided digital interventions: A systematic review and meta-analysis of randomised controlled trials. Lancet Digit. Health 2020, 2, e25–e36. [Google Scholar] [CrossRef] [Green Version]
  85. Connolly, S.L.; Hogan, J.B.; Ecker, A.H.; Gloston, G.F.; Day, G.; Shore, J.H.; Lindsay, J.A. Telepsychiatry and video-to-home (including security issues). Ment. Health A Digit. World 2022, 147–167. [Google Scholar] [CrossRef]
  86. Kidd, S.A.; Feldcamp, L.; Adler, A.; Kaleis, L.; Wang, W.; Vichnevetski, K.; McKenzie, K.; Voineskos, A. Feasibility and outcomes of a multi-function mobile health approach for the schizophrenia spectrum: App4Independence (A4i). PLoS ONE 2019, 14, e0219491. [Google Scholar] [CrossRef] [Green Version]
  87. Kenter, R.M.F.; van de Ven, P.M.; Cuijpers, P.; Koole, G.; Niamat, S.; Gerrits, R.S.; Willems, M.; van Straten, A. Costs and effects of Internet cognitive behavioral treatment blended with face-to-face treatment: Results from a naturalistic study. Internet Interv. 2015, 2, 77–83. [Google Scholar] [CrossRef] [Green Version]
  88. Singh, S.; Germine, L. Technology meets tradition: A hybrid model for implementing digital tools in neuropsychology. Int. Rev. Psychiatry 2021, 33, 382–393. [Google Scholar] [CrossRef] [PubMed]
  89. Richards, D.A.; Bower, P.; Pagel, C.; Weaver, A.; Utley, M.; Cape, J.; Pilling, S.; Lovell, K.; Gilbody, S.; Leibowitz, J.; et al. Delivering stepped care: An analysis of implementation in routine practice. Implement. Sci. 2012, 7, 3. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  90. van Straten, A.; Hill, J.; Richards, D.A.; Cuijpers, P. Stepped care treatment delivery for depression: A systematic review and meta-analysis. Psychol. Med. 2014, 45, 231–246. [Google Scholar] [CrossRef] [PubMed]
  91. American Psychological Association. APA Dictionary of Psychology: Mental Health Care. 2022. Available online: https://dictionary.apa.org/mental-health-care (accessed on 11 November 2022).
  92. Centers for Disease Control and Prevention. Suicide Prevention. 2022. Available online: https://www.cdc.gov/suicide/index.html (accessed on 11 November 2022).
  93. Rice, S.M.; Purcell, R.; McGorry, P.D. Adolescent and Young Adult Male Mental Health: Transforming System Failures Into Proactive Models of Engagement. J. Adolesc. Health 2018, 62, S9–S17. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Flowchart of the study selection procedure.
Table 1. Overview of empirical studies that focused on evaluation of the DMH platform/s used.
Reference | Authors | Study Design/Main Aim | DMH Platform (Type, Purpose of Use and Population) | Outcomes/Form of Evidence | Approach/Comparison
[58] | Alvarez-Jimenez et al. (2021) | RCT. To ascertain the feasibility, acceptability, and safety of MOST+ | Integrated-multifunctional DMH platform (Horyzons, a derivative of MOST)—used for targeting early intervention for youth psychosis (n = 170) through treatment, employment and education | Feasibility, acceptability and safety—no significant effect on social functioning compared with treatment as usual, although there were significant correlations between system use, perceived helpfulness, and a number of secondary outcome variables, e.g., increased likelihood to enroll in education/find employment or fewer psychosis-related visits to hospitals and emergency services | Blended mental health care and usual primary care
[59] | Baumel et al. (2018) | Quantitative—survey (purposive sample). To examine the feasibility, acceptance, and preliminary clinical outcomes of using 7Cups | Self-guided and guided therapy DMH platform (7Cups)—online self-help tools and 24/7 emotional support delivered by trained volunteers—mothers with postpartum depression (n = 19) were targeted in an adjunct treatment | Feasibility and acceptability—7Cups significantly decreased postpartum depression symptoms, although there was no significant difference compared with treatment as usual | Blended mental health care and self-guided mental health care
[60] | Boucher et al. (2021) | Qualitative—focus group. To explore how Happify Health may be an effective tool for disseminating loneliness interventions | Self-guided and guided therapy DMH platform (Happify Health)—used to target loneliness in adults aged 18–64 years (who indicated wanting to be more connected to others when signing up to the DMH platform) (n = 11) | Feasibility—preliminary evidence of effectiveness for using Happify Health in loneliness interventions. The DMH platform may be useful as a productive distraction | Self-guided and guided mental health care
[61] | Craig et al. (2021) | Quantitative—survey (purposive sample). To describe the preliminary efficacy of AFFIRM Online | Guided cognitive behavior therapy (CBT)-based intervention DMH platform (AFFIRM Online)—a DMHI applying ICBT targeting LGBTQA+ youth (n = 46) | Feasibility and acceptability—effectiveness in the community-based implementation of AFFIRM Online for depression and coping with stress | Blended mental health care and waitlist control
[62] | Johansson et al. (2019) | RCT. To determine the effectiveness of using the Swedish health care system's ICBT platform | Guided CBT-based DMH platform (Swedish health care system)—targeting depression in routine psychiatry for adult patients (n = 108) with a primary diagnosis of major depressive disorder, excluding those with postpartum onset, ongoing alcohol or substance abuse disorder, assessment as a high-risk suicidal patient, active self-harm, a current eating disorder, bipolar disorder, ongoing psychotic symptoms, or co-occurring psychotherapy | Feasibility—preliminary evidence of efficacy for the Swedish health care system's ICBT platform for treating depression in routine psychiatric care, although the small study size and the provision of general psychiatric care after the ICBT treatment limit the implications | Blended mental health care and waitlist control
[63] | LaMonica et al. (2022) | Qualitative—focus group. To describe (1) the codesign process of Innowell, (2) the DMH platform's acceptance by stakeholders, and (3) an evaluation of its impact at the level of the service user, health professional, and service | Integrated DMH platform (Innowell)—performance indicators evaluated by representatives of stakeholders (i.e., Open Arms and headspace) for young people, veteran and general population mental health care services (n = 84) | Feasibility—stakeholders support digital health in mental health care settings, and simulations of Innowell under idealized implementation conditions are promising, although organizational readiness for change, local-level leadership, appropriateness for end users and funding models hinder integration | Blended mental health care alone
[64] | Marcelle et al. (2019) | Quantitative—questionnaire. To investigate the preliminary effectiveness of BetterHelp for providing psychotherapy | Multimodal psychotherapy DMH platform (BetterHelp)—active users self-reported on depression symptoms (n = 318) | Effectiveness—preliminary evidence of the use of BetterHelp in the treatment of adult depression. However, experimental trials are needed | Blended mental health care and usual primary care
[65] | O'Dea et al. (2021) | Mixed methods. To evaluate the effectiveness of Smooth Sailing for help-seeking in students | Integrated DMH platform (Smooth Sailing) pilot trial—secondary students' symptoms of anxiety and depression were screened and linked to online self-help or in-person care with a school counselor; parents (n = 6) and school counselors (n = 4) were interviewed about their experiences with the delivery of the Smooth Sailing service model | Feasibility, acceptability and engagement—initial support for the use of Smooth Sailing in secondary schools to identify at-risk students, with benefits including ease of DMH platform use and psychoeducation, although the model requires parental consent and would benefit from higher uptake and engagement through more frequent screening and targeting older students | Stepped mental health care and self-guided mental health care
[66] | Smit et al. (2021) | Qualitative—semi-structured interviews (purposive sample). To capture the user perspective on Depression Connect | Integrated DMH platform (Depression Connect)—experiences with an online peer support community for individuals with depression (n = 15)—thematic analysis | Usability and engagement—the sample of users reported that the peer support DMH platform is an accessible, safe and valuable tool for sharing experiences of coping with depression. However, longitudinal research is required | Blended mental health care alone
[67] | Venning et al. (2021) | Qualitative—semi-structured interviews and focus groups. To determine what people generally thought about the look, feel, and functionality of the DMH platform | Guided CBT-based (Low Intensity Virtual Coach) DMH platform—experiences and engagement of a convenience sample of university students (n = 16) and mental health professionals (n = 5) | Acceptability and engagement—mostly negative experiences were reported, indicating that the Virtual Coach was unrelatable and hard to engage with; the effectiveness of Virtual Coach DMH platforms appears to be limited due to low levels of acceptability and engagement | Blended mental health care alone
[68] | Vichta et al. (2018) | Mixed methods. To facilitate young people's perspectives on the use and experiences of DMH platforms | An unspecified range of existing DMH platforms—interactive workshops and an online survey gathered young people's (n = 404) perspectives on DMH platform integration into youth mental health care | Feasibility—DMH platforms can assist in evaluating youth wellbeing over time, although innovative approaches are required to gather qualitative data in a way that reaches young people in their own world | Blended mental health care alone
Table 2. Overview of empirical studies that focused on evaluation of the DMHI/s applied on the DMH platform/s.
Reference | Authors | Study Design/Main Aim | DMH Platform (Type, Purpose of Use and Population) | Outcomes/Form of Evidence | Approach/Comparison
[69] | Bucci et al. (2018) | Qualitative—semi-structured interviews (purposive sample). To assess the feasibility and acceptability of Actissist, a digital health intervention | Guided CBT-based DMH platform intervention (Actissist) targeting youth psychosis—early psychosis service user (n = 21) perspectives | Feasibility and acceptability—largely positive views on the use of DMHIs for health care delivery, although there are concerns over privacy and data security | Blended mental health care alone
[70] | Doukani et al. (2020) | Qualitative—semi-structured interviews (purposive sample). To examine the working alliance demands and adapt a conceptual framework to an intervention for depression | Guided CBT-based DMH platform intervention as part of the E-compared trial—interviews with people with major depressive disorder (n = 19) to investigate the design of the working alliance | Feasibility, usability and engagement—the study is the first to offer a preliminary conceptual framework of the working alliance in ICBT for depression, including how to establish, plan and promote a user-practitioner relationship in engagement strategies for technological design and clinical practice delivery | Blended mental health care alone
[71] | Goldkind and Wolf (2021) | Qualitative—interviews (purposive sample). To ask practitioners to describe their lived experience of providing tele-mental health services | Direct-to-consumer tele-mental health (DTCTMH) platforms (unspecified)—affordances described by social work practitioners (n = 21) | Usability and engagement—key affordances of DTCTMH platforms include accessibility, anonymity, meaningful work, autonomy, lifelong learning, and access by new populations, although there are hindering ethical complexities and structural challenges | Blended mental health care alone
[72] | Gray et al. (2020) | Qualitative—semi-structured interviews. To elicit participant views on using SMART Recovery for routine outcome monitoring as a standard component of a mutual support group | Self-guided and guided DMH platform (SMART Recovery) for routine outcome monitoring, i.e., mutual support in addiction recovery—adults primarily with alcohol, drug and gambling addictions or other addictions (n = 20) | Feasibility—the use of SMART Recovery may complement weekly in-person group meetings, although its use could pose a threat to in-person mutual support, especially for those with previous experience of such meetings | Self-guided and guided mental health care
[73] | Hentati et al. (2021) | RCT. To investigate differences in treatment engagement between two different user interfaces (UIs) for DMH services | Self-guided mental health problem-solving intervention DMH platform (Swedish health care system)—optimized UI versus basic UI for the Swedish general population (n = 397) | Usability and engagement—an optimized UI based on user experience (UX) design principles adds to treatment engagement with the DMH platform, i.e., generating more solutions to behavioral problems, although self-rated usability and treatment credibility may not be affected by whether the UI is optimized | Self-guided mental health care alone
[74] | Knapp et al. (2021) | Qualitative—focus groups. To understand how digital tools can be integrated into settings that serve young people | Integrated DMH platform (a centralized DMH platform to connect the clinician, young person, and young person's family)—clinician perspectives (n = 37) on a desired integrated DMH platform to deliver mental health care for children and adolescents | Feasibility—clinicians use digital tools to increase engagement and help young people build skills, facilitate learning, and monitor symptoms. However, a centralized DMH platform is recommended to improve accessibility by securely connecting the clinician, young person, and caregivers, and tailored solutions are required to serve youth-oriented needs | Blended mental health care alone
[75] | LaMonica et al. (2020) | Mixed methods. To systematically monitor and evaluate the impact of implementing the InnoWell DMH platform into Australian mental health services to facilitate the refinement of the platform and the associated service model | Integrated DMH platform (Innowell)—evaluation of Project Synergy's impact—surveys (n = 47), semi-structured interviews (n = 3), and workshops with representatives from health and social policy agencies, nongovernment organizations, primary care providers, emergency services, research institutions, community groups, and people with lived experience of suicide | Feasibility, acceptability and engagement—consensus that Innowell may benefit consumers and services, although implementation is hindered by a lack of readiness for change, e.g., technological infrastructure, staff digital literacy and clarifying who is responsible for recommending digital solutions | Blended mental health care alone
[76] | Richards et al. (2020) | RCT. To evaluate the (cost-)effectiveness of ICBT for depression and anxiety in a pragmatic clinical trial within routine stepped care | Integrated-multifunctional DMH platform (SilverCloud)—ICBT for people with anxiety and depression disorders (n = 361) within the Improving Access to Psychological Therapies (IAPT) program | Effectiveness—SilverCloud's ICBT was effective in >50% of people diagnosed with anxiety and/or depression (recovered after three months) and cost-effective for IAPT after 12 months | Stepped mental health care and waitlist control
[77] | Sindoni (2020) | Qualitative—case studies. To analyze how participants' identity and distance are indexed by focusing on how interpersonal relations are mapped linguistically and multimodally in #YouCanTalk on the Beyond Blue DMH platform | Integrated DMH platform (Beyond Blue) applied in a case study on multimodal discourse analysis of peer support and professional mental health care for general populations targeting anxiety, depression and suicidality; a second case study applied multimodal discourse analysis to the #YouCanTalk web-based social media campaign and online support forum | Usability and engagement—the Beyond Blue DMH platform used direct language appropriate to target anxiety, depression and suicidality, and #YouCanTalk is multimodal in terms of language, layout, modularity and content distribution, as well as pictures, infographics and videos, although more datasets are required to help understand how to reduce distance in mental health communication | Blended mental health care alone
[78] | Titov et al. (2020) | Quantitative—observational study. To provide a summary of demographic characteristics and treatment outcomes for patients registered with MindSpot over its first 7 years of operation, including service use and symptom severity, and to examine trends in these characteristics over time | Integrated-multifunctional DMH platform (MindSpot)—descriptive analysis of patients' depression, anxiety, general distress and disability symptoms as well as post-treatment satisfaction (n = 121,652 screening users and 14,503 treatment users during a 7-year study) | Usability and engagement—a high assessment completion rate (78.9%); a very high rate (96.65%) of satisfaction with the MindSpot DMH platform; overall improvement in psychological symptoms sustained for 3 months after treatment; utility for a high-volume DMH service, although the relatively small size of the registered sample limits generalizability | Self-guided and guided mental health care
[79] | Valentine et al. (2020) | Qualitative—semi-structured interviews. To gain young people's perspectives on the design and operation of a blended model of care in first-episode psychosis treatment | Integrated-multifunctional DMH platform (Horyzons, a derivative of MOST)—young people in first-episode psychosis treatment (n = 10)—perspectives on design and implementation | Acceptability—young people supported blended mental health care provided it assists face-to-face treatment, although further research is needed on the efficacy of the blended care approach by evaluating its impact on the therapeutic alliance, clinical and social outcomes, cost-effectiveness, and engagement | Blended mental health care alone
