Scaling Digital Health Innovation: Developing a New ‘Service Readiness Level’ Framework of Evidence
Abstract
1. Introduction
2. Related Works
2.1. The Benefits of Scaling DHI
2.2. Challenges in Scaling DHI
2.3. Strong Evidence Base
3. Aims and Methods
3.1. Interview Method
- Overview/briefing paper detailing the purpose of the research, emailed in advance to interviewees (N = 18). The interviewees were drawn from senior management involved in DHI in Scotland, covering finance (2), clinical care (9), service management (3) and technical (4) roles (interviewees are anonymised using codes that represent their key skill: F—finance, C—clinical, S—service and T—technical). Their selection covered the main institutions involved in DHI decision making, including the Scottish Government, the NHS and innovation group project teams. In terms of DHI experience, 15 interviewees had over 10 years' experience in the field, 2 had over 5 years and 1 had under 5 years, with a gender split of 50% male and 50% female. The interviews were semi-structured, encompassing interviewees' experience with DHI projects, examples of good practice, barriers to scaling and their views on how scaling might be advanced (see Appendix B for the interview questions), and took place over a 24-month period (2018 to 2020).
- Digital audio recordings of the interviews for transcription and later thematic analysis and coding.
- Field notes undertaken during the sessions, with key words and themes highlighted.
- Reflexive thematic analysis (deductive) [23], following a process of data familiarisation and data coding to generate key themes, supported by NVivo (V12).
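To make the coding and tallying step concrete, the minimal sketch below (in Python) shows how coded transcript segments could be aggregated into per-theme reference counts of the kind reported in the themes table later in this article. The segment data and structure here are hypothetical illustrations only; the study itself used NVivo (V12) rather than custom code.

```python
from collections import Counter

# Hypothetical coded segments: (interviewee code, theme, excerpt).
# The study coded transcripts in NVivo (V12); this is only an
# illustrative analogue of the deductive coding/tally step.
coded_segments = [
    ("C&Y5", "Service/Organisational", "the business case was only the end point..."),
    ("T&Y1", "Technology", "you're having to plug something into our existing infrastructure"),
    ("F&Y1", "Finance, legal and standards", "they have an economic dimension..."),
    ("S&Y1", "Service/Organisational", "confidence and assurance"),
]

# Tally how many coded references fall under each deductive theme,
# mirroring the per-theme totals reported in the themes table
# (e.g., 400 references for Service/Organisational in the full dataset).
references_per_theme = Counter(theme for _, theme, _ in coded_segments)

for theme, count in references_per_theme.most_common():
    print(f"{theme}: {count} reference(s)")
```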
3.2. Framework Development Method
- Interview content mapped and linked to different stages in the DHI project lifecycle—identified through the illustrations offered by those interviewed as part of this study.
- Framework constructed in line with the existing NASA technology readiness levels (TRL) framework [24], using the same analogy and principles.
- Service readiness levels (SRL) described using headline titles summarised from the illustrations offered and from leading examples of DHI projects that had reached, or were moving towards, national scale. These titles were then arranged in chronological order to describe the types of activities and the journey observed, as per the initial interview content. The framework evolved through the later consultations and feedback during the initial validation of the SRL framework (see Appendix A), with title headings changed and reordered where necessary.
- This SRL framework was then tested and validated through N = 14 interviews with key DHI leaders to gain further feedback and detail and to optimise the usefulness of the framework. These interviewees included 5 who were involved in the original research and an additional 9 who held comparable senior management roles in relation to DHI in Scotland; these interviews took place over 2020/21.
4. Findings
4.1. Thematic Evidence Bases
4.1.1. Service and Organisational Evidence
“there are very few people who could really change business aspects of the service…. And then there’s even fewer people who are given the authority and the capacity to actually take action. So even the people who are interested in the change and can express what the future should be like, don’t really have the means to move forward to scale” (T&X2).
“does it solve or contribute to our real challenge in the system…so people see its value?”
“you have to draw on quite a lot of know-how, I think you’ve got to have quite a range of skills to actually bring that together and to put it into something as business as usual”.
4.1.2. Clinical Evidence
“re-evaluate its position around what conditions can we make to make sure that it is safe, and that it is complying” (S&X1).
“…without this clinical backing it [is] unlikely that even if an innovation is proved to be effective, efficient and convenient for end users it will be too difficult to mandate without clinical leadership and champions being in place to promote to their peers and drive this forward as an acceptable option” (T&X1).
4.1.3. Finance, Legal and Standards Evidence
“...you can’t look at these projects just as health projects in isolation, they have an economic dimension, education and industry dimensions, various facets to them that need to be taken into account”. (F&Y1).
4.1.4. Citizen Evidence
4.1.5. Political Evidence
4.1.6. Technology Evidence
“in terms of evidence … you’re buying a thing that isn’t connected to other things. There’s the whole interoperability.... You’re having to plug something into our existing infrastructure” (T&Y1).
4.2. Project Lifecycle Evidence
“there’s lots of different people involved at different stages” and that “the business case was only the end point of quite a long process” that “targeted multiple different structures, each with a different purpose … continual throughout. So, it’s not a position where you’re delivering a business case, which then is a surprise to people. Actually, the reality of the business case was almost decided” (C&Y5) due to decisions being made along the way with “a lot of people to convince” (C&Y2) to accumulate a sense of “confidence and assurance” (S&Y1).
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A
Appendix A.1. DHI Service Readiness Level (SRL) Descriptions
Appendix A.2. Service Readiness Levels Summary Titles
- SRL 1—Demand/Needs assessment and vision;
- SRL 2—Current state mapping;
- SRL 3—Landscape review/Horizon scanning;
- SRL 4—Future state options co-designed;
- SRL 5—Future state preferred and simulated;
- SRL 6—Real world evidence (RWE) testing;
- SRL 7—Evaluation and evidence gathering;
- SRL 8—Case for scale developed (or proposal to iterate further);
- SRL 9—Service implemented and scaled.
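Taken together, these nine levels form an ordered progression in which sign-off against each level's exit criteria gates advancement to the next (see the SRL framework table later in this appendix). As a minimal illustrative sketch, assuming hypothetical names and a deliberately simplified gate check (the paper defines the levels descriptively, not computationally), the ordering could be modelled as:

```python
from enum import IntEnum

class SRL(IntEnum):
    """Service readiness levels, ordered so a higher value means closer
    to national scale (per Appendix A.2). Names are hypothetical
    shorthand for the summary titles listed above."""
    DEMAND_NEEDS_AND_VISION = 1
    CURRENT_STATE_MAPPING = 2
    LANDSCAPE_REVIEW_HORIZON_SCANNING = 3
    FUTURE_STATE_OPTIONS_CODESIGNED = 4
    FUTURE_STATE_PREFERRED_AND_SIMULATED = 5
    RWE_TESTING = 6
    EVALUATION_AND_EVIDENCE_GATHERING = 7
    CASE_FOR_SCALE = 8
    SERVICE_IMPLEMENTED_AND_SCALED = 9

def can_advance(level: SRL, exit_criteria_signed_off: bool) -> bool:
    """A project moves up one level only once the exit criteria for its
    current level have been signed off. Simplified: the framework also
    allows iterating in place, e.g., around SRL 6-8."""
    return exit_criteria_signed_off and level < SRL.SERVICE_IMPLEMENTED_AND_SCALED

# Example: a project at SRL 6 whose RWE pilot has been signed off by the
# project team and programme board may proceed to SRL 7.
assert can_advance(SRL.RWE_TESTING, exit_criteria_signed_off=True)
assert not can_advance(SRL.SERVICE_IMPLEMENTED_AND_SCALED, exit_criteria_signed_off=True)
```

An `IntEnum` is used here so that levels compare and sort by their numeric readiness order, matching the chronological arrangement of the summary titles.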
Appendix A.3. Discussion of Each Level
Appendix A.3.1. SRL 1—Demand/Needs Assessment and Vision
Appendix A.3.2. SRL 2—Current State Mapping
Appendix A.3.3. SRL 3—Landscape Review/Horizon Scanning
Appendix A.3.4. SRL 4—Future State Options Co-Designed
Appendix A.3.5. SRL 5—Future State Preferred and Simulated
Appendix A.3.6. SRL 6—RWE Testing
Appendix A.3.7. SRL 7—Evaluation and Evidence Gathering
Appendix A.3.8. SRL 8—Case for Scale Developed (or Proposal to Iterate Further)
Appendix A.3.9. SRL 9—Service Implemented and Scaled
Appendix B. Interview Questions (Semi-Structured)
- Can you describe your current/former role and the responsibilities associated with it?
- Have you had any previous experience implementing, commissioning or developing digital health and care (DHC, also known as technology enabled care (TEC)) interventions? If so, please explain the projects you have been involved in and your role and responsibilities.
- In the projects highlighted, how were these evaluated and what was the overriding impact?
- Are you more interested in the service innovation (redesign), technology innovation or business innovation, or all/a combination/none? Describe why, and please prioritise where possible.
- Please specifically describe what the intervention/s focused on, articulating what you think success would be/would have been for you.
- What was your main aim for this intervention?
- What technology is being used/implemented or proposed (e.g., software, communication channel, app, monitor, gadget (e.g., Wearable), information portal, social media platform etc.)? And why?
- Benefit realisation plans are often difficult to detail. In your opinion, why is this?
- In an ideal world, and from your perspective, would you be looking for the benefit in relation to individuals and outcomes, or ‘the system’, or both, or something else? If so, explain.
- And from your perspective, what needs to be proven for you to be satisfied that this could be scaled?
- What evidence is important to you or those that you need to convince?
- What evidence are other key stakeholders/partners looking for? Would you say these are mutually exclusive/opposing, or can they be woven together?
- From your experience what evaluation metrics/methods have been used in the past? What ones have been the most effective and why? Reference key examples where possible.
- In the past have these methods given you enough robust evidence to make an informed decision to progress/terminate? Please give an example, if possible.
- From your perspective what are the main gaps in the current evidence methods/base? And what are the main frustrations?
- Are there any new emerging tools/guidelines/methods that you are aware of (possibly being used in other industries, countries or projects)? If so, please describe them.
- Are there any obvious barriers to them being proposed/utilised for digital health and care projects?
- In your opinion how can digital health and care be evaluated to better evidence impact and benefit/dis-benefit?
- Is there a need for early evidence indicators? If so (and we are aware this is dependent on the size/scale/timelines of the project), what timeframe would be beneficial to your position (e.g., interim evidence every 6 months to show early indicators)?
- If you had a clean sheet, what would the perfect (DHC) evaluation look like? What attributes would be essential? And what evidence needs to be delivered by the end of any project?
- Considering that data is playing a more important role, are there any opportunities/barriers in using it for evidence?
References
1. Udovita, P. Conceptual review on dimensions of digital transformation in modern era. Int. J. Sci. Res. Publ. 2020, 10, 520–529.
2. Standing, C.; Standing, S.; McDermott, M.L.; Gururajan, R.; Kiani Mavi, R. The paradoxes of telehealth: A review of the literature 2000–2015. Syst. Res. Behav. Sci. 2018, 35, 90–101.
3. Vial, G. Understanding digital transformation: A review and a research agenda. J. Strateg. Inf. Syst. 2019, 28, 118–144.
4. Karimi, J.; Walter, Z. The role of dynamic capabilities in responding to digital disruption: A factor-based study of the newspaper industry. J. Manag. Inf. Syst. 2015, 32, 39–81.
5. HIMSS. Digital Health: A Framework for Healthcare Transformation. 2020. Available online: https://www.gs1ca.org/documents/digital_health-affht.pdf (accessed on 7 September 2021).
6. Marvel, F.A.; Wang, J.; Martin, S.S. Digital health innovation: A toolkit to navigate from concept to clinical testing. JMIR Cardio 2018, 2, e7586.
7. UK National Audit Office. Digital Transformation in the NHS; NAO: London, UK, 2020. Available online: https://www.nao.org.uk/wp-content/uploads/2019/05/Digital-transformation-in-the-NHS.pdf (accessed on 7 August 2021).
8. Kuipers, P.; Humphreys, J.S.; Wakerman, J.; Wells, R.; Jones, J.; Entwistle, P. Collaborative review of pilot projects to inform policy: A methodological remedy for pilotitis? Aust. N. Z. Health Policy 2008, 5, 17.
9. Huang, F.; Blaschke, S.; Lucas, H. Beyond pilotitis: Taking digital health interventions to the national level in China and Uganda. Glob. Health 2017, 13, 49.
10. Lennon, M.R.; Bouamrane, M.M.; Devlin, A.M.; O’Connor, S.; O’Donnell, C.; Chetty, U.; Mair, F.S. Readiness for delivering digital health at scale: Lessons from a longitudinal qualitative evaluation of a national digital health innovation program in the United Kingdom. J. Med. Internet Res. 2017, 19, e6900.
11. Labrique, A.B.; Wadhwani, C.; Williams, K.A.; Lamptey, P.; Hesp, C.; Luk, R.; Aerts, A. Best practices in scaling digital health in low and middle income countries. Glob. Health 2018, 14, 103.
12. Desveaux, L.; Soobiah, C.; Bhatia, R.S.; Shaw, J. Identifying and overcoming policy-level barriers to the implementation of digital health innovation: Qualitative study. J. Med. Internet Res. 2019, 21, e14994.
13. European Union. Assessing the Impact of Digital Transformation of Health Services; European Union: Luxembourg, 2019. Available online: https://ec.europa.eu/health/sites/health/files/expert_panel/docs/022_digitaltransformation_en.pdf (accessed on 7 September 2021).
14. Ricciardi, W.; Pita Barros, P.; Bourek, A.; Brouwer, W.; Kelsey, T.; Lehtonen, L. How to govern the digital transformation of health services. Eur. J. Public Health 2019, 29, 7–12.
15. Greenhalgh, T.; Wherton, J.; Papoutsi, C.; Lynch, J.; Hughes, G.; Hinder, S.; Fahy, N.; Procter, R.; Shaw, S. Beyond adoption: A new framework for theorizing and evaluating nonadoption, abandonment, and challenges to the scale-up, spread, and sustainability of health and care technologies. J. Med. Internet Res. 2017, 19, e367.
16. Greenhalgh, T.; Wherton, J.; Papoutsi, C.; Lynch, J.; Hughes, G.; Hinder, S.; Shaw, S. Analysing the role of complexity in explaining the fortunes of technology programmes: Empirical application of the NASSS framework. BMC Med. 2018, 16, 66.
17. Abimbola, S.; Patel, B.; Peiris, D.; Patel, A.; Harris, M.; Usherwood, T.; Greenhalgh, T. The NASSS framework for ex post theorisation of technology-supported change in healthcare: Worked example of the TORPEDO programme. BMC Med. 2019, 17, 233.
18. James, H.M.; Papoutsi, C.; Wherton, J.; Greenhalgh, T.; Shaw, S.E. Spread, scale-up, and sustainability of video consulting in health care: Systematic review and synthesis guided by the NASSS framework. J. Med. Internet Res. 2021, 23, e23775.
19. Murray, E.; Hekler, E.B.; Andersson, G.; Collins, L.M.; Doherty, A.; Hollis, C.; Rivera, D.E.; West, R.; Wyatt, J.C. Evaluating digital health interventions: Key questions and approaches. Am. J. Prev. Med. 2016, 51, 843–851.
20. Guo, C.; Ashrafian, H.; Ghafur, S.; Fontana, G.; Gardner, C.; Prime, M. Challenges for the evaluation of digital health solutions—A call for innovative evidence generation approaches. NPJ Digit. Med. 2020, 3, 110.
21. Arksey, H.; O’Malley, L. Scoping studies: Towards a methodological framework. Int. J. Soc. Res. Methodol. 2005, 8, 19–32.
22. Rowley, J. Conducting research interviews. Manag. Res. Rev. 2012, 35, 260–271.
23. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101.
24. NASA. Technology Readiness Levels. 2012. Available online: https://www.nasa.gov/directorates/heo/scan/engineering/technology/technology_readiness_level (accessed on 7 September 2021).
| Themes | Sub-Themes |
|---|---|
| Service/Organisational (400 references) | Service demand and vision; Service quality; Current service understanding; Future preferred service transformation; Service benefits and impacts expected; Service change, implementation, and transferability |
| Clinical (300 references) | Clinical acceptance; Clinical effectiveness and better use of resources; Clinical efficacy and patient safety; Leadership and ownership |
| Finance, legal and standards (183 references) | Cost and return on investment; Value for money, including procurement approaches; Affordability and sustainability; Risk, benefits, liability, and standards/regulations |
| Citizen (95 references) | Citizen experience; Citizen demand and empowerment; Citizen benefits |
| Political and policy (91 references) | Strategy alignment; Political guidance and sponsorship |
| Technology (68 references) | Existing and disruptive technology; Acceptability, usability and accessibility; Interoperability, adaptability and integration |
| | Service Readiness Levels (SRL) | Evaluation Methods | Evidence Summary | Assurance—Exit Criteria |
|---|---|---|---|---|
| | SR9—Service change implemented | Normal service change control process and evaluation methods should be followed. | The service is implemented into business as usual (BAU) and will follow normal evaluation and improvement practice for refinements; support packs in place. | New service accepted as BAU—business continuity/improvement/SLAs in place. |
| | SR8—Case for scale | Parallel run required between the project team and the service implementation/change/BAU team. Process, clinical, economic, financial and technical evaluation substantiated with qualitative feedback from clinicians, service managers, eHealth, finance/legal/policy execs and customers (citizens—patients/carers, popn.). | The service/BAU team must feel comfortable with the evidence before the service is onboarded in a live environment and offered at scale. Process, finance and economic evaluation evidence, including technical due diligence evidence. Implementation/set-up pack, blueprint and sustainability plan. Benefits realisation/impact case—as per CSF. Business continuity plan. A full business case could be built, or further proposals made to allow the innovation to be transferred for more testing/iterations *—to test transferability. | Case for scale—sign-off by implementing organisation and national funder (often Government). Sign-off by programme board SRO—to progress to national scale commitment. The SRO must be assured that all evidence is present and endorsed by boards such that it is regarded as a sound case for investment. |
| Multiple iterations * | SR7—Evaluation and evidence gathered | Process evaluation; clinical controlled trial (RCT variations—pragmatic); CBA/ROI/cost effectiveness/cost consequence/cost utility; economic impact analysis; HTA; surveys/interviews (users, clinical, service, etc.); PROMS/PREMS; QALY; comparative and consequential studies; QoL; HRQoL; EQ-5D (EuroQol—5 Dimension); carbon footprint analysis. | Report findings on effectiveness, safety, acceptability, affordability and sustainability; comparators from current state to new service state; comparators with other regions. Test for Change report. Patient data on experience and outcomes. Quality of life, quality of service and specific metrics related to outcomes and impact, e.g., reduced waiting times, bed days, falls, exacerbations, net zero carbon emissions, etc. | Sign-off by project team and programme board. The SRO must be content that the evidence is sufficient to allow either the full business case or a subsequent proposal that evolves the DHI for further adoption testing with other health boards. |
| | SR6—Real world evidence testing | Basic service, economic and financial modelling—CSF made clear. Service simulations and blueprint/process evaluation methods considered. | Small pilots (case for testing articulated)—aggregating previous information and presenting current RWE findings. Simulation can be used at this point. Test for change (TEC) activated if required; CSF must be clear at this point. | Sign-off by project team and programme board; SRO commitment demonstrated to invest resources in a pilot. |
| | SR5—Future state accepted in principle | Usability/accessibility testing; EQIA; acceptability testing; interviews and surveys; future mapping methods; net zero contribution analysis. | High-level evidence gathered that the service is/will be generally accepted within work practices, can be used effectively with ease, is intuitive, does not cause extra work and, importantly, creates benefits. Endorsed by a range of stakeholders (organisational, clinical, patients/citizens, political, finance/legal/standards including procurement approaches, and technical). | Sign-off and assurance from professionals—clinical, eHealth and service staff—as an acceptable future option that warrants RW testing, weighed against levels of risk/opportunity/benefit to the system. |
| Parallel and iterative | SR4—Future state (FS) options co-designed | Simulation; paper prototyping; participatory co-design workshops/insights—persona/storytelling methods and visual illustrations. | Service redesign options and digital opportunities explored; pathway reviews and opportunity options appraised. High-level FS blueprint drafts. Case studies/storytelling/personas used to communicate the future state options, with possibilities linked to infrastructure/interoperability implications. | Sign-off at professional level that the FS options have been validated, supported by patient views/feedback—senior sponsor endorsement and assurance in place. |
| | SR3—Horizon scanning | Landscape/literature/market review—market analysis; best practice; desk research; rapid review; interviews/surveys; champions. | Publications/reports on similar services and innovations—horizon scanning. Competitive analysis—past evaluation/evidence data of the innovation (used, tested, implemented). Empirical evidence gathered is appropriate (systematic reviews referenced or conducted). Art of the possible articulated. | Sign-off by project team that desk research on best practice has been reviewed and there is assurance of an appetite at senior level in the organisation/system to promote change (e.g., new ways of working). |
| | SR2—Current state (CS) understood/accepted/validated | Pathway/process mapping; interviews and surveys; costing of the current service. | Baseline data; service cost; senior service staff views; evidence that there is a senior sponsor. | Sign-off at professional level that the CS is a true representation, supported by patient views and feedback. |
| | SR1—Demand—problem validation and vision | Needs and gaps analysis to identify a clear, quantifiable demand/need/gap. | Demand data (ISD, NPI, etc.)—testimonials/endorsement at a senior level (e.g., CEO of an NHS Board, CMO, Government Director, Minister, policy lead). Clear vision. | Sign-off by SRO and funding partners. |
Service Readiness Levels (SRL) Framework—Definitions (columns 1–9 correspond to SRL 1–9):

| Evidence Themes | 1. Demand, Needs and Vision (Assessed and Validated) | 2. Current State (CS) (Agreed and Validated) | 3. Horizon Scanning/Landscape Review | 4. Future State (FS) (Co-Design) Option Appraisal | 5. Future State (Preferred and Validated) | 6. Real World Evidence (RWE) Testing | 7. Evaluation of the Pilot RWE Site(s)—Evidence Gathered | 8. Case for Scale (Business Case/Proposal Development) | 9. Implement at Scale and Improve DHI (as Required) |
|---|---|---|---|---|---|---|---|---|---|
| Service and Organisational evidence | Expert opinion/view; Operational service stats—local/national (ISD); Vision statement | CS service journey map—baseline | Publications, case studies and learning/best practice identified | Service option appraisal (FS maps)/Service support | Preferred future state service map and path/Comms plan and PR | Change mgmt. review/training/IG/DSP/BRP/DPIA/TP/Risks/EQIA/Comms pack | SQ/Legal/HR/IT/IG/IP/Budget/Risks/Process eval/Org outcomes | Blueprint/Imp/Setup/Impact plan/TM/CM/EQIA/DPIA/Comms/Data plan | Strategic/Financial/Commercial/Mgmt. case/BAU plan—set-up pack—imp plan/Net zero action plan |
| Clinical evidence | Universal view/Baseline demand/SPARRA—info services/Hypothesis/Endorsement | CS service journey map—baseline | Publications/Case studies/Patient safety | FS maps/Multi-disciplinary testimonials/Efficacy/Ethics | Endorsement/Simulation/Leadership/Ethics/Risk review | Leadership/Change mgmt.—workflow review/RCT/Marketing | Safety—CRM/CSC/Acceptability/Effectiveness/Patient outcomes | CLP/TM/CSA/Adherence/Patient impacts/PR—marketing | Strategic case—Impact/Gov/SOP/Improvement backlog/Comms |
| Finance, legal and standards evidence | Approx. costs of demand focus—local/national and legal/standards view | CS approx. costings—initial costs gathered (if possible) | Cost studies/Procurement review/Total cost factor | Approx. costs—all options/Consequence of ‘do nothing’ | Cost comparison (CS vs. FS)—option review; Net zero considered | Procurement approach view/Ethics app/Economic evaluation | CCA/CEA/TCO/CBA/Affordability and value for money review; Net zero contribution plan | CBA/ROI/HTA/CUA/CA/CSv/Procurement and sustainability; Net zero impact plan | Economic/Commercial case/ROI/CBA/GVA/NPV + financial budgets/Net zero impact defined |
| Citizen evidence | Test citizen views on hypothesis/Target population nrs./Future demand projections | CS citizen journey map—baselined/QoL/QALY benchmarking if possible | Publications/Case studies/Best practice identified/Personas built | Interview data—viewpoint/Personas/FS map/General requirements | Testimonials on FS appetite/EQIA drafted/Risk review; Personas revised | Acceptance/Accessible/Usable/Cost to citizen/PROMS/PREMS/Surveys | UA/UX—usability data/CtA/QALY/QoL/HRQoL/PROMS/PREMS/Survey/Interviews | EQIA/Privacy/Case studies/Benefits and impacts/HRQoL/User stories and personas illustrated | Strategic case/Comms and marketing campaign/Training |
| Political/Policy evidence | Test political support/Strategic alignment/Policy benefits | Policy/strategic review and priority alignment (targets + timelines identified); macro costs—system | Political/Priority/Importance/Critical success factors/Strategy review (national) | Political support and sponsorship review—benefit plan (NZ incl.) | Endorsement/Risk review/Sponsor at local + national level; benefit plan | Sponsorship/Policy instrument review; benefit checkpoint | Confirm sponsorship/Benefits/NPI/Net zero/EQIA/Case outline | Confirm political/strategic buy-in/CSF/EQIA—quantify social—NZ—economic benefits | Strategic case briefing/Policy paper/Proposal/Benefits plan/NZ contribution/Imp plan |
| Technical evidence | Tech pull or push—acceptability (consumer demand and appetite to use digital for the focused target groups/popn.) | Existing version of tech/Integration/interoperability check and high-level roadmap—baseline | Publications/Case studies/Reference sites; Adaptation/interoperability review | Tech appraisal/FS alpha dev/Infrastructure/Integration/UA/UX/PT testing | HTA/FS tech architecture map/IMTO/Simulation and alpha prototype | Data models/Hardware/Software/UI testing/Accessibility/Beta dev | SSP/IG/PECR/CE/MDR/FDA/UA/UX/PT—pen tests/IP/W3C WAI/UAT/UX | Business model—costs/TCO/User numbers/HTA/UAT/Integration plan and costs | Commercial/Financial case/Sales comms and PR plan; Service contract and maintenance SLA |