Abstract
Background: Dashboards have become ubiquitous in health care settings, but to achieve their goals, they must be developed, implemented, and evaluated using methods that help ensure they meet the needs of end users and are suited to the barriers and facilitators of the local context.
Objective: This scoping review aimed to explore published literature on health care dashboards to characterize the methods used to identify factors affecting uptake, strategies used to increase dashboard uptake, and evaluation methods, as well as dashboard characteristics and context.
Methods: MEDLINE, Embase, Web of Science, and the Cochrane Library were searched from inception through July 2020. Studies were included if they described the development, implementation, or evaluation of a health care dashboard and were published from 2018 to 2020. Clinical setting, purpose (categorized as clinical, administrative, or both), end user, design characteristics, methods used to identify factors affecting uptake, strategies to increase uptake, and evaluation methods were extracted.
Results: From 116 publications, we extracted data for 118 dashboards. Inpatient (45/118, 38.1%) and outpatient (42/118, 35.6%) settings were most common. Most dashboards had ≥2 stated purposes (84/118, 71.2%); purposes were solely administrative for 54 of 118 (45.8%) dashboards, solely clinical for 43 of 118 (36.4%), and both administrative and clinical for 20 of 118 (16.9%). Most dashboards included frontline clinical staff as end users (97/118, 82.2%). To identify factors affecting dashboard uptake, half involved end users in the design process (59/118, 50%); fewer described formative usability testing (26/118, 22%) or use of any theory or framework to guide development, implementation, or evaluation (24/118, 20.3%). The most common strategies used to increase uptake included education (60/118, 50.8%); audit and feedback (59/118, 50%); and advisory boards (54/118, 45.8%). Evaluations of dashboards (84/118, 71.2%) were mostly quantitative (60/118, 50.8%), with fewer using only qualitative methods (6/118, 5.1%) or a combination of quantitative and qualitative methods (18/118, 15.2%).
Conclusions: Most dashboards forgo steps during development that help ensure they suit the needs of end users and the clinical context; qualitative evaluation, which can provide insight into ways to improve dashboard effectiveness, is uncommon. Education and audit and feedback are frequently used to increase uptake. These findings illustrate the need for promulgation of best practices in dashboard development and will be useful to dashboard planners.
International Registered Report Identifier (IRRID): RR2-10.2196/34894
doi:10.2196/59828
Introduction
Health care systems must process and make sense of more incoming data than ever before. Understanding and acting on these data are essential to almost every aspect of the health care enterprise, from direct patient care and clinical research, in which real-time data are critical to safe, appropriate, and timely care, to the C-suite, where health systems are held financially accountable for the outcomes of their patients [ - ]. This process can be resource intensive. One large academic medical center reported expending roughly 180,000 person-hours and US $5 million to prepare and report 162 quality metrics on inpatient and emergency department performance in a single year [ ]. Increasingly, business intelligence tools are used to reduce this burden by streamlining data aggregation and reporting to facilitate continuous monitoring and improvement of key metrics [ - ].

Health care “dashboards,” which analyze and present dynamic data about individuals and systems in readily interpretable ways to provide high-level and current snapshots of important metrics, have become one of the most common tools in this armamentarium. In modern health systems, they are widely seen as indispensable and are commonly used for clinical management, population health management, and quality improvement [ , , ]. Despite dashboards’ ubiquity in health care, there is little research on them and how they have been used in practice [ , ]. Fundamental questions such as how they are developed, implemented, and evaluated have largely gone unexplored [ , - ]. Yet consideration of each of these stages is critical to the successful implementation of any innovation, including dashboards. Indeed, health care systems are complex entities, containing diverse stakeholders with multiple overlapping and sometimes conflicting information needs [ , ]. Consequently, the development and dissemination of a dashboard is not a straightforward or linear process; rather, it has been described as an “unpredictable, messy, and iterative process” involving multiple stakeholders [ ].

In this scoping review, we apply the lens of implementation science—which addresses how to improve uptake of an innovation by accounting for contextual factors of the setting—to dashboards in health care settings, asking how developers have approached the interconnected steps of development, implementation, and evaluation. Specifically, we investigate the methods used to identify factors affecting uptake, strategies used to increase uptake, and evaluation methods. With this approach, we hope to draw attention to the need for systematic approaches to dashboard development and dissemination that incorporate principles of implementation science, identify common practices, and ultimately accelerate the science of dashboards.
Methods
Overview
In this scoping review, we followed Arksey and O’Malley’s [ ] and Levac et al’s [ ] frameworks for scoping review methodology to identify and map relevant literature. Methods and results are reported according to the PRISMA-ScR (Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews) checklist [ ].

The conceptual framework for this review was the Generic Implementation Framework [ ], which describes the core activities required for implementation of a practice. According to the Generic Implementation Framework, the process of implementation consists of three nonlinear and recursive stages: (1) identification of key factors, namely barriers and facilitators to uptake; (2) selection of strategies to increase uptake; and (3) evaluation. At the center of the process are the innovation itself and the context, which impact each of these 3 stages. In this review, we operationalized this framework as the following overarching questions:

- What methods have been used to identify factors affecting dashboard uptake?
- What strategies have been used to increase uptake?
- What evaluation methods have been used?
Additionally, we investigated the topic and users of the dashboard and the context.
Study Selection and Screening
The search strategy was developed with a research librarian and previously reported [ ]. Briefly, in July 2020, we searched MEDLINE, Embase, Web of Science, and the Cochrane Library databases from inception through July 2020, using key terms, medical subject headings, and Boolean operators, with no date, language, or other restrictions applied. All records were uploaded into Covidence screening software for deduplication and dual reviewer screening of titles and abstracts. All studies describing use of a dashboard within a health care setting were included for full-text independent review by 2 study team members (authors: DH, ADR, AR, and ANK; other study team members: Rebecca Goldberg, Marisa L Conte, and Oliver Jintha Gadabu).

Studies were eligible if they described how a dashboard was developed, implemented, or evaluated in a health care setting; were published in English since 2018; and were used successfully in routine workflow [ ]. Although we had initially planned to include any studies published in English since 2015, because of unplanned staffing and resource limitations, the inclusion criteria were updated to focus solely on the more recent years 2018‐2020. Exclusion criteria were implementation of the dashboard only in a pretesting environment and use solely for public health disease tracking or undergraduate medical education. Any disagreements on eligibility were resolved through discussion, with adjudication by a third author when needed. In the full-text screening stage, for any clinical trial registrations (eg, ClinicalTrials.gov NCT number), ClinicalTrials.gov was visited and reviewed for linked publications, which were imported into Covidence for deduplication and full-text screening.

Data Extraction and Coding
A data extraction form was developed a priori [ ]. In meetings, the extraction form was iteratively refined and a codebook of response options for categorical variables (eg, health care setting, dashboard purpose) was developed. Data were extracted using Qualtrics.

For all included dashboards, data were extracted on health care context, dashboard purpose, intended end user, and design features; methods used to identify factors affecting uptake; strategies used to increase uptake; and evaluation methods, using predefined attributes (see the table below for definitions of common purposes and the codebook for full definitions). For purposes of data extraction, a list of methods for identifying factors affecting uptake was informed by existing guidelines for curating health data and designing health informatics interventions for practice improvement (eg, use of theoretical frameworks, end user involvement, formative usability testing, benchmarking based on established guidelines) [ , ]. Strategies used to increase uptake were informed by the Expert Recommendations for Implementing Change strategy list [ ], a widely used compendium of implementation strategies.

Dashboard purpose | Codebook definition |
Clinical purposes | |
Direct patient care | A dashboard used when providing direct, immediate care to a patient in any health care setting. |
Population health management | A dashboard used to identify patients in a clinic panel, department, or unit who are at risk for an adverse event or in need of intervention (eg, dashboard identifies patients with potentially unsafe prescribing). |
Care coordination | A dashboard that supports care coordination by pulling information from multiple data sources and allowing both the patient and a health care provider to view the dashboard, and/or allowing the patient to enter self-reported health data into the dashboard to complement electronic health record information for the clinician to use for care planning and decision-making. |
Administrative purposes | |
Performance monitoring | A dashboard that provides data on individual provider or unit/site performance. These dashboards often show performance trends over time as well as offer the user the ability to compare their performance to that of peers or to averages within their department. |
Utilization tracking | A dashboard used to provide data on health care utilization, either at the level of the patient (eg, how often they visit, how long visits take, where the patient is seen) or at the level of the department or organization (eg, services per day/month/year, services by category or unit, top services provided by cost or in a given time period). |
Resource management | A dashboard used to support resource management by providing data to support adequate staffing, ensure appropriate and adequate supplies are available, and monitor bed management and patient transfers. |
aDefinitions for the most commonly reported dashboard purposes are displayed here. All codebook definitions are reported in the study codebook.
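To make the extraction structure concrete, the sketch below shows one way a single dashboard's extracted attributes could be represented. It is purely illustrative: the field names and category labels are assumptions modeled on the codebook described above, not the authors' actual Qualtrics instrument.

```python
# Illustrative sketch only: a minimal record mirroring the extraction form
# described above. Field names are assumptions, not the authors' instrument.
from dataclasses import dataclass, field

@dataclass
class DashboardRecord:
    study_id: str
    settings: list[str]        # eg, ["inpatient setting", "emergency services"]
    purposes: list[str]        # codebook purposes, eg, ["performance monitoring"]
    end_users: list[str]       # eg, ["frontline clinicians", "leaders/administrators"]
    uptake_methods: list[str]  # eg, ["end user involvement", "usability testing"]
    strategies: list[str]      # ERIC-informed strategies, eg, ["audit and feedback"]
    evaluation: list[str] = field(default_factory=list)  # eg, ["quantitative: EHR data"]

# A hypothetical extraction for one included dashboard:
record = DashboardRecord(
    study_id="example-001",
    settings=["inpatient setting"],
    purposes=["performance monitoring", "population health management"],
    end_users=["frontline clinicians", "leaders and/or administrators"],
    uptake_methods=["end user involvement in design"],
    strategies=["audit and feedback", "educational sessions"],
)
```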
Analysis

Due to the large number of categories for dashboard purposes and end users, we created larger super-categories for these domains so data could be summarized at a higher level. For both domains, these super-categories were “administrative/nonclinical,” “clinical,” “both administrative/nonclinical and clinical,” or “research” (Tables S1 and S2).

For dashboard purpose, the following attributes were considered: (1) clinical (direct patient care, population health management, and care coordination) and (2) administrative (performance monitoring; utilization tracking; resource management; financial tracking; alert or best practice advisory tracking; facilitating clinical or quality registry use; supporting education or training; and facility management). Some dashboards were categorized as both clinical and administrative (eg, used for performance monitoring and population health management). Dashboards used solely to support clinical research activities were categorized as research.
For end users, the following attributes were considered: (1) nonclinical (leadership, administrators, and individuals involved in quality improvement efforts) or (2) clinical (frontline clinicians, pharmacists, clinician trainees, remote monitoring staff, and clinical research teams). Some dashboards were categorized as having both nonclinical and clinical end users (eg, used by leadership or administration and frontline clinicians).
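As a minimal sketch of how this super-categorization could be operationalized, assuming a simple lookup from codebook purpose to group: the mapping and function below are our illustration, not the authors' published code, and the mapping covers only the purposes named above.

```python
# Illustrative sketch of the purpose super-categorization described above;
# the mapping and logic are assumptions, not the authors' published code.
PURPOSE_GROUPS = {
    "direct patient care": "clinical",
    "population health management": "clinical",
    "care coordination": "clinical",
    "performance monitoring": "administrative",
    "utilization tracking": "administrative",
    "resource management": "administrative",
    "financial tracking": "administrative",
    # remaining administrative purposes omitted for brevity
}

def categorize(purposes: list[str], research_only: bool = False) -> str:
    """Collapse a dashboard's coded purposes into one super-category."""
    if research_only:  # used solely to support clinical research activities
        return "research"
    groups = {PURPOSE_GROUPS[p] for p in purposes if p in PURPOSE_GROUPS}
    if groups == {"clinical", "administrative"}:
        return "both administrative/nonclinical and clinical"
    return groups.pop() if groups else "other"

# eg, performance monitoring + population health management -> combined group
print(categorize(["performance monitoring", "population health management"]))
```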
Extracted data for variables of interest are reported as counts and percentages. Additional data on the methods used to identify factors affecting uptake, implementation strategies used, and evaluation type are reported by purpose category (eg, clinical or administrative). Data on dashboard development and design characteristics are described narratively in online appendices and summarized in the text. Citations are provided in the text for results with ≤20 references, though all extracted data are available for download online and can be filtered by variable of interest to identify any relevant studies.
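As a worked example of this reporting convention, the sketch below tabulates counts and percentages with pandas, reproducing the purpose-group totals reported in the Results; the DataFrame construction is illustrative rather than the authors' analysis code.

```python
# Minimal sketch, assuming one row per included dashboard (n=118); the values
# below simply reproduce purpose-group totals reported in the Results.
import pandas as pd

df = pd.DataFrame({
    "purpose_group": ["administrative"] * 54 + ["clinical"] * 43
                     + ["both"] * 20 + ["other"] * 1,
})

# Report each category as n (%) of all 118 dashboards, as in the tables.
n = len(df)
counts = df["purpose_group"].value_counts()
summary = pd.DataFrame({"n": counts, "%": (counts / n * 100).round(1)})
print(summary)  # eg, administrative: 54 (45.8); clinical: 43 (36.4)
```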
Results
Study Screening Process
A total of 3306 unique studies were identified and underwent title and abstract review; in all, 1288 articles were excluded, and the full texts of the remaining 2149 studies were screened. Ultimately, 116 studies that described 118 unique dashboards were included.

Dashboard Characteristics and Context
Most dashboards were used in North America (79/118, 66.9%) or Europe (18/118, 15.2%, predominantly the United Kingdom; see the table below and Table S3). Seven US dashboards originated from the Veterans Affairs Health System [ - ]. The most prevalent settings were inpatient (n=45, 38.1%), outpatient clinics (n=42, 35.6%), and emergency services (n=18, 15.2%) [ , - ]; in addition, 12/118 (10.2%) were used in >1 health care setting [ - , - , , , , - ]. Frontline clinicians (97/118, 82.2%, predominantly physicians) and leadership or administrators (50/118, 42.4%) were frequent end users, often in combination (28/118, 23.7%; Table S2). Patients were sometimes included as end users (12/118, 10.2%) [ - ].

Characteristic | Dashboards (n=118), n (%) |
Publication year | |
2018 | 38 (32.2) |
2019 | 39 (33.1) |
2020 | 41 (34.7) |
Health care setting | |
Inpatient setting | 45 (38.1) |
Outpatient clinic | 42 (35.6) |
Emergency services | 18 (15.2) |
Other setting or unclear | 12 (10.2) |
Imaging or radiology facility | 10 (8.5) |
Surgical departments | 7 (5.9) |
Clinical laboratory | 5 (4.2) |
Settings reported | |
1 setting | 106 (89.8) |
>1 setting | 12 (10.2) |
Geographic location | |
North America | 79 (66.9) |
Europe | 18 (15.2) |
Asia | 11 (9.3) |
Africa | 6 (5.1) |
Australia | 4 (3.4) |
South America | 0 (0) |
Purpose |
Clinical purposes | |
Direct patient care | 47 (39.8) |
Population health management | 37 (31.4) |
Care coordination | 22 (18.6) |
Administrative purposes | |
Performance monitoring | 51 (43.2) |
Utilization tracking | 30 (25.4) |
Resource management | 22 (18.6) |
Financial tracking | 4 (3.4) |
Facility management | 0 (0) |
Other nonclinical purpose | 6 (5.1) |
Other purposes | |
Clinical trial support tool | 1 (0.8) |
Other/unclear | 0 (0) |
Number of purposes | |
1 purpose | 34 (28.8) |
2 or more purposes | 84 (71.2) |
Intended end user |
Clinical end users | |
Frontline clinicians | 97 (82.2) |
Medical doctor or advanced practice provider | 76 (64.4) |
Registered nurse or medical assistant | 34 (28.8) |
Other or not specified | 38 (32.2) |
Pharmacists or pharmacy staff | 12 (10.2) |
Patients | 12 (10.2) |
Clinician trainees | 5 (4.2) |
Remote monitoring staff | 4 (3.4) |
Clinical research teams | 2 (1.7) |
Nonclinical end users | |
Leaders and/or administrators | 50 (42.4) |
Quality improvement stakeholders | 3 (2.5) |
Other | 8 (6.8) |
End user not reported | 2 (1.7) |
Number of end users | |
1 end user | 54 (45.8) |
2 or more end users | 64 (54.2) |
aOne included study from 2019 described 2 dashboards (Woo et al [ ]).
bOne included study from 2020 described 2 dashboards (Stevens et al [ ]).
cDatabases were searched in July 2020 and only studies published and indexed in the databases searched by this date were screened for inclusion.
dCharacteristics are reported by prevalence of selection of each response across dashboards without missing data for the variable of interest. As characteristics are reported by prevalence of selection, totals may be greater than 100%.
eGeographic location by country is available in Table S3.
fMapping of all purpose and user responses from data extraction to nonclinical, clinical, or both nonclinical and clinical groups is available in Tables S1 and S2, respectively.
gNonclinical purposes of education or training, facilitate use of clinical or quality registries, and tracking of alerts or best practice advisories were grouped as other. Definitions for all dashboard purposes are available in the study codebook.
The purpose was categorized as solely clinical in 43/118 (36.4%), solely administrative in 54/118 (45.8%), and both clinical and administrative in 20/118 (16.9%) studies [ , , , , , , - ]. The most prevalent purposes of dashboards were performance monitoring (n=51, 43.2%), direct patient care (n=47, 39.8%), population health management (n=37, 31.4%), and utilization tracking (n=30, 25.4%; definitions in the study codebook). However, the majority of dashboards (n=84, 71.2%) met criteria for 2 or more purposes. In dashboards with purpose(s) categorized as solely administrative (n=54), most included clinical end users (44/54, 81.5%); few were used solely by nonclinical staff (8/54, 14.8%) [ , , - ]. Clinical users were almost always included as end users, regardless of purpose and setting (cross tab of purpose × user group in Table S4).

Development characteristic | Overall (n=118), n (%) | Nonclinical (n=54), n (%) | Clinical (n=43), n (%) | Both nonclinical and clinical (n=20), n (%) |
Methods used to identify factors affecting uptake | ||||
Use of theoretical framework | 24 (20.3) | 14 (25.9) | 8 (18.6) | 1 (5) |
End user involvement in design | 59 (50) | 26 (48.1) | 23 (53.5) | 10 (50) |
Formative usability testing | 26 (22) | 9 (16.7) | 15 (34.9) | 2 (10) |
Benchmarks or metrics informed by regulatory guidelines | 43 (36.4) | 23 (42.6) | 13 (30.2) | 7 (35) |
Software used for dashboard development | ||||
Software not reported | 72 (61) | 26 (48.1) | 30 (69.8) | 15 (75) |
Custom coding build | 14 (11.9) | 6 (11.1) | 7 (16.3) | 1 (5) |
Tableau | 10 (8.5) | 9 (16.7) | 0 (0) | 1 (5) |
Microsoft Excel | 6 (5.1) | 6 (11.1) | 0 (0) | 0 (0) |
Qlikview | 4 (3.4) | 3 (5.6) | 0 (0) | 1 (5) |
Other software reported | 21 (17.8) | 9 (16.7) | 9 (20.9) | 3 (15) |
Dashboard delivery channel | ||||
Website | 41 (34.7) | 15 (27.8) | 21 (48.8) | 4 (20) |
Embedded within the electronic health record | 17 (14.4) | 3 (5.6) | 9 (20.9) | 5 (25) |
Site intranet or SharePoint | 13 (11) | 10 (18.5) | 2 (4.6) | 1 (5) |
Shared by email | 12 (10.2) | 11 (20.4) | 0 (0) | 1 (5) |
Printed and posted in setting | 12 (10.2) | 6 (11.1) | 2 (4.6) | 4 (20) |
Software app on phone, tablet, or computer | 7 (5.9) | 4 (7.4) | 2 (4.6) | 1 (5) |
Other | 10 (8.5) | 5 (9.2) | 5 (11.6) | 0 (0) |
Delivery channel not reported | 29 (24.6) | 13 (24.1) | 8 (18.6) | 8 (40) |
Dashboard data update frequency reported | ||||
Real time | 31 (26.3) | 11 (20.4) | 13 (30.2) | 7 (35) |
Near–real time (5‐60 min) | 11 (9.3) | 4 (7.4) | 2 (4.6) | 4 (20) |
Daily | 16 (13.6) | 10 (18.5) | 4 (9.3) | 2 (10) |
Weekly, monthly, or quarterly | 12 (10.2) | 11 (20.4) | 0 (0) | 1 (5) |
Various update times | 5 (4.2) | 2 (3.7) | 1 (2.3) | 2 (10) |
Other | 5 (4.2) | 5 (9.3) | 0 (0) | 0 (0) |
Update frequency not reported or unclear | 38 (32.2) | 11 (20.4) | 23 (53.5) | 4 (20) |
aOne dashboard that was categorized as “other” rather than nonclinical, clinical, or both, is not represented in the table. This dashboard was web-based with data reported in near–real time and reported use of a theoretical framework, but it did not report any end user involvement in design; formative usability testing; benchmarks or metrics informed by regulatory guidelines; software used to develop the dashboard; or any visual elements used in the dashboard display.
bDashboard-level details on use of theory or frameworks, involvement of end users in dashboard development, formative usability testing, dashboard metrics informed by professional guidelines or by payor-specific or licensing agency–specific quality metrics, and details on software used to develop dashboards are available in Tables S5-S10.
.cCharacteristics are reported by prevalence of selection of each response across dashboards without missing data for the variable of interest. As characteristics are reported by prevalence of selection, totals may be greater than 100%.
dSoftware responses selected for 3 or more dashboards are shown here, with software reported to be used for 2 or fewer dashboards reported as “other” in this table. Dashboard-level details are available in Table S5.

Dashboard Design Characteristics
The software or coding languages were reported for 46/118 dashboards (39%), with custom coding (14/118, 11.9%) [ , , , , , , - ], Tableau (10/118, 8.5%) [ , , , , - ], Microsoft Excel (6/118, 5.1%) [ , , , , , ], and Qlikview (4/118, 3.4%) [ , , , ] most commonly used (details available in Table S5). Dashboards developed using custom coding often described use of specific programs or coding languages, including SQL, JavaScript, and CSS. Overall, dashboards were most often available to end users as websites (41/118, 34.7%) or as tools embedded directly into the electronic health record (EHR; 17/118, 14.4%) [ , , , , , , , , - ] (combinations reported in Table S6). However, clinical dashboards were more likely to be web-based (21/43, 48.8%) or embedded in the EHR (9/43, 20.9%) [ , , - , , , ], while dashboards with administrative purposes were more likely to be shared by email (11/54, 20.4%) [ , , , , , , , , - ], available via intranet or SharePoint (10/54, 18.5%) [ , , , , , , , , , ], or posted directly within the setting (6/54, 11.1%) [ , , , , , ].

Of dashboards that reported on updating frequency (80/118, 67.8%), most were updated in real time (31/118, 26.3%) or near–real time (5‐60 minutes; 11/118, 9.3%) [ , , , , , , , , - ]. Dashboards used solely for administrative purposes were more likely to take 24 hours or more to update (21/43, 48.8%), while the majority of clinical dashboards updated every 24 hours or less (19/43, 44.2%) [ , , , , , , - , , , , - ].

Methods Used to Identify Factors Affecting Uptake
Half of included dashboards (59/118, 50%) described steps to engage intended end users in the design process. User involvement included dashboard metric selection, data validation, and formation of work groups to iteratively review and revise dashboard prototypes, among other strategies (Table S7). Fewer dashboards described formative usability testing (26/118, 22%; Table S8).

A theoretical or quality improvement framework was used to guide dashboard development, implementation, or evaluation efforts in 24 of 118 (20.3%) dashboards. None of the frameworks were used in more than 2 studies. Reported theories and frameworks varied widely and included behavior change theories (eg, stages of change model [ ], disruptive behavior pyramid theory [ ], active choice principles [ ]), technical frameworks (Unified Theory of Acceptance and Use of Technology [ ], technology acceptance model [ ]), implementation science–specific frameworks (eg, the Consolidated Framework for Implementation Research [ ]; Expert Recommendations for Implementing Change [ ]; Reach, Effectiveness, Adoption, Implementation, and Maintenance [ ]), and a clinical governance framework [ ], among others (dashboard-level details are available in Table S10).

Dashboard Data Content
More than one-third of health care dashboards used payor or accreditation organization reporting standards or professional guidelines as dashboard benchmarks or metrics of interest (43/118, 36.4%). Dashboards designed for performance monitoring included metrics related to value-based payment and quality payment programs [ , , ], or state- or national-level reporting mandates or guidelines [ , , , ]. When used for direct patient care or population health management, clinical guidelines were often used to identify patients for intervention or guide decision support (see [ , , ] for examples; see Table S9 for complete data). Of dashboards that reported on visual elements, tables (66/118, 55.9%), graphs (64/118, 54.2%), and color coding (61/118, 51.7%) were common display elements.

Dashboard Implementation
Most dashboards reported at least 1 implementation strategy (114/118, 96.6%). Common implementation strategies and representative examples included: (1) educational sessions or educational materials (60/118, 50.8%), which ranged from peer-led clinician education [ , , ] to patient education on using the dashboard [ , ]; (2) audit and feedback or relay of clinical data (59/118, 50%), typically through one-on-one discussions between a clinician and a supervisor or academic detailer focused on how to improve performance or reach specific benchmarks [ , , , , , , , ]; and (3) formation of advisory boards or work groups, or engagement of stakeholders (54/118, 45.8%), which were often multidisciplinary groups of clinical staff, site leaders, and sometimes patients, who participated in dashboard development, implementation, or formative usability testing [ , , , , , , , , , , ]. Other strategies included changing the physical environment or record systems (42/118, 35.6%; eg, placement of physical reminders or relevant supplies) as well as needs assessments or efforts to identify implementation barriers and facilitators (37/118, 31.4%). Although many implementation strategies were used at similar rates across dashboards with clinical and nonclinical purposes, audit and feedback was most often used alongside administrative dashboards (34/54, 63%), especially those used for performance monitoring or utilization tracking. Conversely, when dashboards were used for clinical purposes, involving patients or families was more commonly reported (24/43, 55.8%), often to engage patients in shared decision-making.

Characteristics of implementation or evaluation | Overall (n=118), n (%) | Nonclinical (n=54), n (%) | Clinical (n=43), n (%) | Clinical and nonclinical (n=20), n (%) |
Strategies to increase uptake | ||||
Audit and provide feedback or facilitate relay of clinical data | 59 (50) | 34 (63) | 16 (37.2) | 9 (45) |
Conduct educational sessions or disseminate educational materials | 60 (50.8) | 27 (50) | 24 (55.8) | 9 (45) |
Conduct a needs assessment, identify barriers and facilitators | 37 (31.4) | 17 (31.5) | 14 (32.6) | 6 (30) |
Form advisory boards or work groups | 54 (45.8) | 26 (48.1) | 17 (39.5) | 10 (50) |
Identify champions, involve local opinion leaders | 23 (19.5) | 13 (24.1) | 6 (14) | 4 (20) |
Mandate change, institute guidelines | 33 (28.0) | 19 (35.2) | 9 (20.9) | 5 (25) |
Change teams or professional roles | 22 (18.6) | 10 (18.5) | 7 (16.3) | 5 (25) |
Change environment or record systems | 42 (35.6) | 21 (38.9) | 12 (27.9) | 9 (45) |
Involve patients and families, prepare patients to be active in care | 31 (26.3) | 4 (7.4) | 24 (55.8) | 3 (15) |
Financial incentives or disincentives | 8 (6.8) | 4 (7.4) | 3 (7) | 1 (5) |
Remind clinicians or other stakeholders | 12 (10.2) | 3 (5.6) | 6 (14) | 3 (15) |
Other strategy reported | 5 (4.2) | 2 (3.7) | 3 (7) | 0 (0) |
No adjunct implementation strategies reported | 4 (3.4) | 2 (3.7) | 1 (2.3) | 1 (5) |
Number of implementation strategies reported | ||||
0 implementation strategies | 4 (3.4) | 2 (3.7) | 1 (2.3) | 1 (5) |
1‐3 implementation strategies | 67 (56.8) | 28 (51.8) | 26 (60.5) | 12 (60) |
4‐6 implementation strategies | 37 (31.4) | 20 (37) | 13 (30.2) | 4 (20) |
7‐10 implementation strategies | 10 (8.5) | 4 (7.4) | 3 (7) | 3 (15) |
Evaluation type | ||||
Quantitative evaluations only | 60 (50.8) | 29 (53.7) | 20 (46.5) | 10 (50) |
Using dashboard/electronic health record data alone | 41 (34.7) | 25 (46.3) | 11 (25.6) | 5 (25) |
Using survey alone | 9 (7.6) | 1 (1.9) | 5 (11.6) | 3 (15) |
Using both dashboard/electronic health record and survey data | 10 (8.5) | 3 (5.6) | 4 (9.3) | 2 (10) |
Qualitative evaluations only | ||||
Using interview or focus group data | 6 (5.1) | 1 (1.9) | 5 (11.6) | 0 (0) |
Mixed method evaluations | ||||
Using both quantitative and qualitative data | 18 (15.2) | 7 (13) | 8 (18.6) | 3 (15) |
No evaluation reported | ||||
No evaluation reported | 34 (28.8) | 17 (31.5) | 10 (23.3) | 7 (35) |
aOne dashboard that was categorized as “other” rather than nonclinical, clinical, or both, is not represented in the table. This dashboard reported use of one implementation strategy (form advisory boards or workgroups) and included a quantitative evaluation with both electronic health record or dashboard data and survey data.
bImplementation strategies are reported by prevalence of selection of each strategy across included dashboards (n=118). Reported combinations of adjunct implementation strategies used will be reported separately.
cEvaluation type is reported as the combination of evaluation types selected.
Dashboard Evaluation
Most dashboards included results from an evaluation (84/118, 71.2%), whether of the dashboard's own effect, of a change measured with the dashboard as a tool, or of the dashboard as both intervention and measurement tool. Most evaluations were quantitative, using data from the dashboard or EHR alone (41/118, 34.7%), from the dashboard or EHR in combination with survey data (10/118, 8.5%) [ , , , , , , , , , ], or from surveys alone (9/118, 7.6%) [ , , , , , , , , ]. An additional 18 studies reported mixed methods evaluations, which included interviews, focus groups, or analysis of chart notes [ , , , , , - , , , , , , , , , , ]; only 6 reported results of qualitative assessments of end user perceptions of dashboards without a quantitative evaluation [ , , , - ]. When dashboards had an administrative purpose, evaluations were more often conducted using dashboard/EHR data (25/54, 46.3%).

Discussion
Principal Findings and Comparison With Prior Work
This scoping review of 118 dashboards used in health care settings provides an overview of the methods used to identify factors affecting uptake, strategies used to increase uptake, and evaluation methods. Creation of a dashboard does not ensure that it is used or that its aims are achieved. As with any new practice, development, implementation, and evaluation are interrelated steps that ultimately determine whether the practice achieves its goals, and each requires careful attention to contextual factors and to the content and aims of the dashboard itself. Our first principal finding is that most dashboards forgo steps during the development process that help ensure dashboards are suited to the needs of end users—for example, including such end users in the design process and conducting formative usability testing. Second, we have identified the most common implementation strategies used alongside dashboards, which are likely to be useful in planning future dashboard rollouts. Third, we found that roughly 7 in 10 dashboards underwent an evaluation, predominantly quantitative; only about 2 in 10 included a qualitative component.
Despite the proliferation of dashboards, we found major opportunities to improve the development process of dashboards. Half of dashboards (59/118, 50%) did not involve end users in the development process and even fewer (26/118, 22%) included formative usability testing, both of which are effective strategies to improve usability and adoption [ , ]. It is recommended that dashboard developers involve stakeholders in an iterative development process and identify performance metrics that are meaningful, reliable, and timely [ , , , ]. This corroborates findings of a prior systematic review of safety dashboards, which found that a minority used formative usability testing [ ], and another recent scoping review, which found that, even when completed, usability testing is often incomplete [ ]. The complexity of dashboards, exemplified by the multiplicity of purposes, end users, and settings often incorporated into a single dashboard, heightens the importance of thoughtful and deliberate usability testing in dashboard development [ , ]. When developing dashboards for clinicians, who are often overworked and burned out, usability testing will be crucial to making dashboard use as efficient and palatable as possible [ , ]. Physicians are also likely to be more engaged if health IT tools are perceived to provide direct benefit in carrying out their work [ ].

We found a wide range of implementation strategies that have been paired with dashboards, often in combination: education; audit and feedback or relay of information; engagement of working groups, stakeholders, or advisory boards; changing the environment or electronic record systems; and conducting local needs assessments. Knowledge of possible implementation strategies is essential since the mere existence of a dashboard does not ensure its adoption. Prior studies involving dashboard implementation in the US Veterans Affairs health care system found most facilities used an array of implementation strategies to achieve desired quality and safety outcomes [ , ]. To improve the care of patients with cirrhosis, pairing a clinical dashboard with patient outreach was a particularly successful combination [ ]. In our review, many studies similarly leveraged multiple strategies simultaneously to support uptake of dashboards and evidence-based practices. These findings can serve as a starting point for those planning implementation of a new dashboard. Ultimately, the choice of specific implementation strategies should depend on a thorough understanding of the local barriers and facilitators, in keeping with implementation theory [ , ].

It is encouraging that a large proportion of dashboards carried out at least some quantitative evaluation. Doing so likely requires little extra effort by evaluators since the necessary data may often be contained in the dashboard itself. Fewer dashboards performed qualitative evaluations, including methods like focus groups and semistructured interviews, which may add substantial value by providing deeper insights into the results of quantitative findings (the why) and point the way toward future dashboard enhancements to increase impact and sustainability [ ].

Strengths and Limitations
Strengths of our study include the comprehensiveness of the data elements extracted, including health care context, dashboard content and design characteristics, methods to identify factors affecting uptake, strategies used to increase uptake, and evaluation components. In addition, we included all studies in which a dashboard was implemented in a health care setting, which allowed us to capture the full scope of health care dashboards. Most prior reviews of dashboards in health care focused narrowly on specific settings [ , ], end users [ ], and purposes [ , ]. By contrast, our inclusion criteria imposed few restrictions, making our findings generalizable to a wider array of settings.

There are also some limitations. First, we excluded non-English publications, which limits generalizability to other international settings. Second, our search ended in 2020 and thus represents a sample of published literature that does not capture the most recent trends in dashboards. Because our goal was not purely quantitative synthesis, this did not prevent us from broadly surveying dashboard development, implementation, and evaluation. Third, for studies in which the dashboard was not the focus (eg, when a dashboard was only a single part of a larger multicomponent intervention), the studies may not have included a complete description of the dashboard or the development, implementation, or evaluation process; thus, these elements may have been underreported.
Implications
These limitations notwithstanding, our findings have implications for implementation of dashboards and research on dashboards in health care. Given the complexity of many dashboards, often with multiple purposes, settings, and end users simultaneously, stakeholder involvement in dashboard design, metric selection, and iterative usability testing will be critical to ensure smooth and efficient operability for all end users. Usability testing may be particularly important for clinical care dashboards, not only because they have the potential to impact patients, but also because clinicians are already overloaded with administrative and documentation tasks and are increasingly burned out [ - ]. Relatively simple usability testing by novices can pay dividends, with the potential to increase adoption and effectiveness [ ]. In a similar vein, dashboard evaluations should holistically consider potential impacts, including not only the performance indicator or quality measure of interest, but also outcomes important to end users, like impact on workflow and efficiency. Finally, dashboard designers should be aware of the wide range of implementation strategies that have been used alongside dashboards and leverage implementation science and existing theory where possible to promote dashboard adoption and sustainability.

Future research priorities should include a quantitative review of the impact of dashboards on performance indicators, which was not covered in this scoping review; qualitative evaluations of the impact of dashboards on job satisfaction; and comparative research on the effectiveness of different development processes and implementation strategies used with dashboards. The development of best practice statements or reporting checklists for publications on dashboard design may also be useful. These will help to improve our understanding of how and why implementation strategies impact the effectiveness of these efforts [ , ].

Conclusions
In this scoping review of implementation practices associated with dashboards used in health care settings, we have found major opportunities to ensure that dashboards meet the needs of end users and the clinical context; identified the most common strategies used to increase uptake; and demonstrated that quantitative evaluation methods greatly outnumber qualitative methods as part of dashboard evaluations. These findings will help to ensure that planners of future dashboards take steps to maximize implementation success and clarify the agenda needed to move the science of dashboards in health care forward.
Acknowledgments
This work was supported by the US Department of Veterans Affairs (1 I50 HX003251-01) Maintaining Implementation Through Dynamic Adaptations (MIDAS; QUE 20-025; DH, JEK, JBS, PNP, AR, LJD) and the National Institute of Diabetes and Digestive and Kidney Diseases through a K23 award (K23DK118179; JEK). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. We thank Rebecca Goldberg, Marisa L Conte, and Oliver Jintha Gadabu for their assistance with study screening and selection in the title and abstract and full-text review stages. We also thank Dr Shari Rogal for her thoughtful and invaluable feedback on use of the Expert Recommendations for Implementing Change taxonomy to guide categorization of implementation strategies.
Data Availability
Extracted data from published results are available as an online supplement.

Authors' Contributions
DH, JBS, PNP, LJD, ZLL, and JEK conceived and designed the study. DH, ANK, AR, and ADR screened identified articles. DH, ANK, and AR extracted data from included articles. DH and JEK analyzed study data. DH, JBS, and JEK provided project management throughout the study and drafted the manuscript. PNP, ANK, AR, ADR, LJD, and ZLL provided critical revisions of the manuscript.
Conflicts of Interest
JEK has received speaking fees from the Anticoagulation Forum. All other authors declare no conflicts.
Study codebook and examples.
DOCX File, 502 KB
Supplementary tables and additional findings.
DOCX File, 141 KB
Study data file.
XLSX File, 287 KB
Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) checklist.
PDF File, 251 KB
References
- Major hospital quality measurement sets. Agency for Healthcare Research and Quality. 2023. URL: https://www.ahrq.gov/talkingquality/measures/setting/hospitals/measurement-sets.html [Accessed 2024-01-11]
- Pross C, Geissler A, Busse R. Measuring, reporting, and rewarding quality of care in 5 nations: 5 policy levers to enhance hospital quality accountability. Milbank Q. Mar 2017;95(1):136-183. [CrossRef] [Medline]
- Quentin W, Partanen VM, Brownwood I, Klazinga N. Improving healthcare quality in Europe: characteristics, effectiveness and implementation of different strategies. 2019. URL: https://www.ncbi.nlm.nih.gov/books/NBK549260 [Accessed 2024-03-05]
- Regulatory overload: assessing the regulatory burden on health systems, hospitals and post-acute care providers. American Hospital Association. 2017. URL: https://www.aha.org/system/files/2018-02/regulatory-overload-report.pdf [Accessed 2024-11-26]
- Saraswathula A, Merck SJ, Bai G, et al. The volume and cost of quality metric reporting. JAMA. Jun 6, 2023;329(21):1840-1847. [CrossRef] [Medline]
- Rabiei R, Almasi S. Requirements and challenges of hospital dashboards: a systematic literature review. BMC Med Inform Decis Mak. Nov 8, 2022;22(1):287. [CrossRef] [Medline]
- Khairat SS, Dukkipati A, Lauria HA, Bice T, Travers D, Carson SS. The impact of visualization dashboards on quality of care and clinician satisfaction: integrative literature review. JMIR Hum Factors. May 31, 2018;5(2):e22. [CrossRef] [Medline]
- Dowding D, Randell R, Gardner P, et al. Dashboards for improving patient care: review of the literature. Int J Med Inform. Feb 2015;84(2):87-100. [CrossRef] [Medline]
- Randell R, Alvarado N, McVey L, et al. Requirements for a quality dashboard: lessons from national clinical audits. AMIA Annu Symp Proc. 2019;2019:735-744. [Medline]
- Xie CX, Chen Q, Hincapié CA, Hofstetter L, Maher CG, Machado GC. Effectiveness of clinical dashboards as audit and feedback or clinical decision support tools on medication use and test ordering: a systematic review of randomized controlled trials. J Am Med Inform Assoc. Sep 12, 2022;29(10):1773-1785. [CrossRef] [Medline]
- van de Baan FC, Lambregts S, Bergman E, Most J, Westra D. Involving health professionals in the development of quality and safety dashboards: qualitative study. J Med Internet Res. Jun 12, 2023;25:e42649. [CrossRef] [Medline]
- Kuznetsova M, Frits ML, Dulgarian S, et al. An analysis of the structure and content of dashboards used to monitor patient safety in the inpatient setting. JAMIA Open. Oct 2021;4(4):ab096. [CrossRef] [Medline]
- Vazquez-Ingelmo A, Garcia-Penalvo FJ, Theron R. Information dashboards and tailoring capabilities - a systematic literature review. IEEE Access. 2019;7:109673-109688. [CrossRef]
- Barnum TJ, Vaez K, Cesarone D. Your data looks good on a dashboard. HIMSS. 2019. URL: https://www.himss.org/resources/your-data-looks-good-dashboard [Accessed 2024-11-26]
- Shenvi E, Boxwala A, Sittig D, et al. Visualization of patient-generated health data: a scoping review of dashboard designs. Appl Clin Inform. Oct 2023;14(5):913-922. [CrossRef] [Medline]
- Almasi S, Rabiei R, Moghaddasi H, Vahidi-Asl M. Emergency department quality dashboard; a systematic review of performance indicators, functionalities, and challenges. Arch Acad Emerg Med. 2021;9(1):e47. [CrossRef] [Medline]
- Ansari B, Martin EG. Development of a usability checklist for public health dashboards to identify violations of usability principles. J Am Med Inform Assoc. Oct 7, 2022;29(11):1847-1858. [CrossRef] [Medline]
- van Elten HJ, Sülz S, van Raaij EM, Wehrens R. Big data health care innovations: performance dashboarding as a process of collective sensemaking. J Med Internet Res. Feb 22, 2022;24(2):e30201. [CrossRef] [Medline]
- Arksey H, O’Malley L. Scoping studies: towards a methodological framework. Int J Soc Res Methodol. Feb 2005;8(1):19-32. [CrossRef]
- Levac D, Colquhoun H, O’Brien KK. Scoping studies: advancing the methodology. Impl Sci. Sep 20, 2010;5:69. [CrossRef] [Medline]
- Tricco AC, Lillie E, Zarin W, et al. PRISMA extension for Scoping Reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med. Oct 2, 2018;169(7):467-473. [CrossRef] [Medline]
- Moullin JC, Dickson KS, Stadnick NA, et al. Ten recommendations for using implementation frameworks in research and practice. Impl Sci Commun. 2020;1:42. [CrossRef] [Medline]
- Helminski D, Kurlander JE, Renji AD, et al. Dashboards in health care settings: protocol for a scoping review. JMIR Res Protoc. Mar 2, 2022;11(3):e34894. [CrossRef] [Medline]
- Designing consumer health IT: a guide for developers and systems designers. Agency for Healthcare Research and Quality. 2012. URL: https://digital.ahrq.gov/sites/default/files/docs/page/designing-consumer-health-it-a-guide-for-developers-and-systems-designers.pdf [Accessed 2024-12-04]
- Obtaining and using data in practice improvement. Agency for Healthcare Research and Quality. 2022. URL: https://www.ahrq.gov/sites/default/files/wysiwyg/ncepcr/tools/healthit-advisor-handbook.pdf#page=9&zoom=100,92,100 [Accessed 2024-12-04]
- Powell BJ, Waltz TJ, Chinman MJ, et al. A refined compilation of implementation strategies: results from the Expert Recommendations for Implementing Change (ERIC) project. Impl Sci. Feb 12, 2015;10:21. [CrossRef] [Medline]
- Burningham Z, Jackson GL, Kelleher J, et al. The Enhancing Quality of Prescribing Practices for Older Veterans Discharged From the Emergency Department (EQUIPPED) potentially inappropriate medication dashboard: a suitable alternative to the in-person academic detailing and standardized feedback reports of traditional EQUIPPED? Clin Ther. Apr 2020;42(4):573-582. [CrossRef] [Medline]
- Fischer MJ, Kourany WM, Sovern K, et al. Development, implementation and user experience of the Veterans Health Administration (VHA) dialysis dashboard. BMC Nephrol. Apr 16, 2020;21(1):136. [CrossRef] [Medline]
- Graber CJ, Jones MM, Goetz MB, et al. Decreases in antimicrobial use associated with multihospital implementation of electronic antimicrobial stewardship tools. Clin Infect Dis. Aug 22, 2020;71(5):1168-1176. [CrossRef] [Medline]
- Harzand A, Witbrodt B, Davis-Watts ML, et al. Feasibility of a smartphone-enabled cardiac rehabilitation program in male veterans with previous clinical evidence of coronary heart disease. Am J Cardiol. Nov 1, 2018;122(9):1471-1476. [CrossRef] [Medline]
- Rogal SS, Chinman M, Gellad WF, et al. Tracking implementation strategies in the randomized rollout of a Veterans Affairs national opioid risk management initiative. Impl Sci. Jun 23, 2020;15(1):48. [CrossRef] [Medline]
- Rostata-Pesola H, Olender L, Leon N, Aytaman A, Hoffman R, Pesola GR. Hepatitis C virus screening at a Veterans Administration hospital in New York City. J Am Assoc Nurse Pract. Apr 1, 2020;33(8):646-651. [CrossRef] [Medline]
- Smith CEP, Kamal AH, Kluger M, Coke P, Kelley MJ. National trends in end-of-life care for veterans with advanced cancer in the Veterans Health Administration: 2009 to 2016. J Oncol Pract. Jun 2019;15(6):e568-e575. [CrossRef] [Medline]
- Burns JL, Hasting D, Gichoya JW, McKibben B III, Shea L, Frank M. Just in time radiology decision support using real-time data feeds. J Digit Imaging. Feb 2020;33(1):137-142. [CrossRef] [Medline]
- Chaparro JD, Hussain C, Lee JA, Hehmeyer J, Nguyen M, Hoffman J. Reducing interruptive alert burden using quality improvement methodology. Appl Clin Inform. Jan 2020;11(1):46-58. [CrossRef] [Medline]
- Gardner LA, Bray PJ, Finley E, et al. Standardizing falls reporting: using data from adverse event reporting to drive quality improvement. J Patient Saf. Jun 2019;15(2):135-142. [CrossRef] [Medline]
- Heaton HA, Russi CS, Monroe RJ, Thompson KM, Koch KA. Telehealth dashboard: leverage reporting functionality to increase awareness of high-acuity emergency department patients across an enterprise practice. BMJ Health Care Inform. Dec 2019;26(1):e100093. [CrossRef] [Medline]
- Hester G, Lang T, Madsen L, Tambyraja R, Zenker P. Timely data for targeted quality improvement interventions: use of a visual analytics dashboard for bronchiolitis. Appl Clin Inform. Jan 2019;10(1):168-174. [CrossRef] [Medline]
- Huber TC, Krishnaraj A, Monaghan D, Gaskin CM. Developing an interactive data visualization tool to assess the impact of decision support on clinical operations. J Digit Imaging. Oct 2018;31(5):640-645. [CrossRef] [Medline]
- Martinez DA, Kane EM, Jalalpour M, et al. An electronic dashboard to monitor patient flow at the Johns Hopkins Hospital: communication of key performance indicators using the Donabedian model. J Med Syst. Jun 18, 2018;42(8):133. [CrossRef] [Medline]
- Ni Y, Bermudez M, Kennebeck S, Liddy-Hicks S, Dexheimer J. A real-time automated patient screening system for clinical trials eligibility in an emergency department: design and evaluation. JMIR Med Inform. Jul 24, 2019;7(3):e14185. [CrossRef] [Medline]
- Scheinfeld MH, Feltus W, DiMarco P, Rooney K, Goldman IA. The emergency radiology dashboard: facilitating workflow with realtime data. Curr Probl Diagn Radiol. 2020;49(4):231-233. [CrossRef] [Medline]
- Schleyer TKL, Rahurkar S, Baublet AM, et al. Preliminary evaluation of the Chest Pain Dashboard, a FHIR-based approach for integrating health information exchange information directly into the clinical workflow. AMIA Jt Summits Transl Sci Proc. 2019;2019:656-664. [Medline]
- Smalley CM, Willner MA, Muir MR, et al. Electronic medical record-based interventions to encourage opioid prescribing best practices in the emergency department. Am J Emerg Med. Aug 2020;38(8):1647-1651. [CrossRef] [Medline]
- Tan A, Durbin M, Chung FR, et al. Design and implementation of a clinical decision support tool for primary palliative Care for Emergency Medicine (PRIM-ER). BMC Med Inform Decis Mak. 2020;20(1):13. [CrossRef] [Medline]
- Tyler A, Krack P, Bakel LA, et al. Interventions to reduce over-utilized tests and treatments in bronchiolitis. Pediatrics. Jun 2018;141(6):e20170485. [CrossRef] [Medline]
- Matsumoto S, Koyama H, Nakahara I, et al. A visual task management application for acute ischemic stroke care. Front Neurol. 2019;10:1118. [CrossRef] [Medline]
- Schmidt T, Brabrand M, Lassen AT, Wiil UK. Design and evaluation of a patient monitoring dashboard for emergency departments. Stud Health Technol Inform. 2019;264:788-792. [CrossRef]
- Ulaganathan S, Fatah T, Schmidt T, Nohr C. Utilization of a novel patient monitoring dashboard in emergency departments. Stud Health Technol Inform. 2019;262:260-263. [CrossRef]
- Yoo J, Jung KY, Kim T, et al. A real-time autonomous dashboard for the emergency department: 5-year case study. JMIR mHealth uHealth. Nov 22, 2018;6(11):e10666. [CrossRef] [Medline]
- Nelson O, Sturgis B, Gilbert K, et al. A visual analytics dashboard to summarize serial anesthesia records in pediatric radiation treatment. Appl Clin Inform. Aug 2019;10(4):563-569. [CrossRef] [Medline]
- Begum T, Khan SM, Adamou B, et al. Perceptions and experiences with district health information system software to collect and utilize health data in Bangladesh: a qualitative exploratory study. BMC Health Serv Res. May 26, 2020;20(1):465. [CrossRef] [Medline]
- Patel MS, Rathi B, Tashfeen K, Yarubi MA. Development and implementation of maternity dashboard in regional hospital for quality improvement at ground level: a pilot study. Oman Med J. May 2019;34(3):194-199. [CrossRef] [Medline]
- Jung AD, Baker J, Droege CA, et al. Sooner is better: use of a real-time automated bedside dashboard improves sepsis care. J Surg Res. Nov 2018;231:373-379. [CrossRef] [Medline]
- Kumar D, Tully LM, Iosif AM, et al. A mobile health platform for clinical monitoring in early psychosis: implementation in community-based outpatient early psychosis care. JMIR Ment Health. Feb 27, 2018;5(1):e15. [CrossRef] [Medline]
- Niendam TA, Tully LM, Iosif AM, et al. Enhancing early psychosis treatment using smartphone technology: a longitudinal feasibility and validity study. J Psychiatr Res. Jan 2018;96:239-246. [CrossRef] [Medline]
Abbreviations
EHR: electronic health record
PRISMA-ScR: Preferred Reporting Items for Systematic Reviews and Meta-Analyses extension for Scoping Reviews
Edited by Christian Lovis, Jiban Khuntia; submitted 08.05.24; peer-reviewed by Helen Monkman, Hilco J van Elten; final revised version received 26.09.24; accepted 26.10.24; published 10.12.24.
Copyright © Danielle Helminski, Jeremy B Sussman, Paul N Pfeiffer, Alex N Kokaly, Allison Ranusch, Anjana Deep Renji, Laura J Damschroder, Zach Landis-Lewis, Jacob E Kurlander. Originally published in JMIR Medical Informatics (https://medinform.jmir.org), 10.12.2024.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Informatics, is properly cited. The complete bibliographic information, a link to the original publication on https://medinform.jmir.org/, as well as this copyright and license information must be included.